Apr 19 12:30:36.364767 ip-10-0-129-233 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 12:30:36.830619 ip-10-0-129-233 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:36.830619 ip-10-0-129-233 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 12:30:36.830619 ip-10-0-129-233 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:36.830619 ip-10-0-129-233 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 12:30:36.830619 ip-10-0-129-233 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:36.833141 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.833058    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 12:30:36.837864 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837848    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837866    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837871    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837874    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837877    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837879    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837882    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837885    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837887    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837890    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837892    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837896    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837898    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837901    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837903    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837906    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837908    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837911    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837915    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:36.837908 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837918    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837921    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837924    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837926    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837929    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837931    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837934    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837936    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837939    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837942    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837945    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837947    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837949    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837952    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837961    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837964    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837966    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837969    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837972    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837979    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:36.838348 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837982    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837984    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837987    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837989    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837991    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837994    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837996    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.837998    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838001    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838003    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838005    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838008    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838010    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838012    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838015    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838018    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838021    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838023    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838027    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:36.838843 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838031    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838034    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838037    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838039    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838042    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838044    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838047    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838049    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838051    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838054    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838056    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838059    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838061    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838070    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838073    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838076    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838078    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838080    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838083    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838086    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:36.839299 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838089    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838091    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838094    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838097    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838101    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838103    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838105    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838109    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838554    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838560    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838563    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838566    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838569    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838571    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838574    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838577    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838579    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838582    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838584    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838586    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:36.839827 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838589    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838592    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838594    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838597    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838600    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838608    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838611    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838613    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838616    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838619    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838621    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838623    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838626    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838628    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838631    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838633    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838635    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838638    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838640    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838644    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:36.840297 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838647    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838649    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838652    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838654    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838657    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838659    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838661    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838664    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838667    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838669    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838671    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838674    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838676    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838679    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838681    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838683    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838687    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838691    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838699    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:36.840794 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838702    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838704    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838708    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838711    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838715    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838717    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838720    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838723    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838725    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838728    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838730    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838732    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838736    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838738    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838741    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838743    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838746    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838748    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838750    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838753    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:36.841250 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838755    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838758    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838760    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838763    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838768    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838771    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838774    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838776    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838778    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838781    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838783    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838786    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838794    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838796    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.838799    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840780    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840789    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840800    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840804    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840808    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 12:30:36.841753 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840811    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840815    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840820    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840823    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840826    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840829    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840833    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840836    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840839    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840842    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840844    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840847    2578 flags.go:64] FLAG: --cloud-config=""
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840850    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840852    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840858    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840861    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840865    2578 flags.go:64] FLAG: --config-dir=""
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840867    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840871    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840874    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840878    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840881    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840884    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840886    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 12:30:36.842230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840889    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840898    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840901    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840904    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840909    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840912    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840915    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840917    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840920    2578 flags.go:64] FLAG: --enable-server="true"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840923    2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840930    2578 flags.go:64] FLAG: --event-burst="100"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840933    2578 flags.go:64] FLAG: --event-qps="50"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840936    2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840940    2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840943    2578 flags.go:64] FLAG: --eviction-hard=""
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840946    2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840949    2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840952    2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840955    2578 flags.go:64] FLAG: --eviction-soft=""
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840958    2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840960    2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840963    2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840966    2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840969    2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840971    2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 19 12:30:36.842837 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840974    2578 flags.go:64] FLAG: --feature-gates=""
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840978    2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840981    2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840984    2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840987    2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840990    2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840993    2578 flags.go:64] FLAG: --help="false"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840996    2578 flags.go:64] FLAG: --hostname-override="ip-10-0-129-233.ec2.internal"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.840999    2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841008    2578 flags.go:64] FLAG:
--http-check-frequency="20s" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841010 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841014 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841018 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841020 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841023 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841026 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841029 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841032 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841034 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841038 2578 flags.go:64] FLAG: --kube-reserved="" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841041 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841044 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841047 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 12:30:36.843427 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:30:36.841049 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 12:30:36.843427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841052 2578 flags.go:64] FLAG: --lock-file="" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841055 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841058 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841061 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841066 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841069 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841071 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841074 2578 flags.go:64] FLAG: --logging-format="text" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841077 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841082 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841085 2578 flags.go:64] FLAG: --manifest-url="" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841087 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841091 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841094 2578 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841098 2578 flags.go:64] FLAG: --max-pods="110" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841101 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841104 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841107 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841121 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841124 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841127 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841129 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841141 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841144 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841147 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 12:30:36.844013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841149 2578 flags.go:64] FLAG: --pod-cidr="" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841152 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 12:30:36.844661 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:30:36.841158 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841161 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841164 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841167 2578 flags.go:64] FLAG: --port="10250" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841170 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841173 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0afe0bd45a2a12131" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841176 2578 flags.go:64] FLAG: --qos-reserved="" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841178 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841181 2578 flags.go:64] FLAG: --register-node="true" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841184 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841187 2578 flags.go:64] FLAG: --register-with-taints="" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841190 2578 flags.go:64] FLAG: --registry-burst="10" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841193 2578 flags.go:64] FLAG: --registry-qps="5" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841196 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841199 2578 flags.go:64] FLAG: --reserved-memory="" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:30:36.841203 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841206 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841209 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841212 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841214 2578 flags.go:64] FLAG: --runonce="false" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841217 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841220 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841223 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 19 12:30:36.844661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841225 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841233 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841237 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841239 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841242 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841245 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841248 2578 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841251 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841254 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841257 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841260 2578 flags.go:64] FLAG: --system-cgroups="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841263 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841268 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841271 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841273 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841281 2578 flags.go:64] FLAG: --tls-min-version="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841284 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841287 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841289 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841292 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841295 2578 flags.go:64] FLAG: --v="2" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:30:36.841299 2578 flags.go:64] FLAG: --version="false" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841303 2578 flags.go:64] FLAG: --vmodule="" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841308 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.841311 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 12:30:36.845262 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841414 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841417 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841421 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841424 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841427 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841429 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841432 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841435 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841437 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841440 2578 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841452 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841455 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841457 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841460 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841463 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841466 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841468 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841470 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841473 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841487 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:30:36.845868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841490 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841493 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:30:36.846364 ip-10-0-129-233 
kubenswrapper[2578]: W0419 12:30:36.841495 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841499 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841503 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841507 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841510 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841513 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841517 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841520 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841523 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841525 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841528 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841530 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841533 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841535 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841538 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841541 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841543 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841545 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:30:36.846364 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841548 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841550 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:30:36.846868 
ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841553 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841556 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841558 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841561 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841563 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841566 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841568 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841571 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841573 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841576 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841578 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841581 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841583 2578 feature_gate.go:328] unrecognized 
feature gate: Example Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841586 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841588 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841591 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841593 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841595 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:30:36.846868 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841598 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841601 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841603 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841606 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841608 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841611 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841613 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:30:36.847340 
ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841616 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841618 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841621 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841623 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841625 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841628 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841631 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841634 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841636 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841639 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841641 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841643 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:30:36.847340 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841646 2578 feature_gate.go:328] unrecognized 
feature gate: AlibabaPlatform
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841648 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841651 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841653 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841658 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841661 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.841663 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:36.847850 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.842571 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:30:36.850513 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.850466 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 19 12:30:36.850513 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.850506 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850554 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850559 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850562 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850566 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850569 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850573 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850575 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850578 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850581 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850584 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850587 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850589 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850592 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850594 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850597 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850599 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850602 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850604 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850607 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:36.850628 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850610 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850612 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850614 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850617 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850619 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850622 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850624 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850627 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850630 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850633 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850636 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850639 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850641 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850644 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850647 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850649 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850652 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850654 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850657 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850659 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:36.851109 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850663 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850666 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850669 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850672 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850674 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850677 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850679 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850682 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850684 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850686 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850689 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850691 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850694 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850697 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850699 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850702 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850704 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850707 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850710 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850712 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:36.851603 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850714 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850717 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850720 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850722 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850725 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850727 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850729 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850732 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850735 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850738 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850740 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850742 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850746 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850751 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850753 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850756 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850759 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850763 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850765 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:36.852085 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850768 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850771 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850774 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850776 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850778 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850781 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850784 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850787 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.850792 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850901 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850906 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850909 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850912 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850914 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850917 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850919 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:36.852559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850922 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850924 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850927 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850929 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850932 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850934 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850936 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850939 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850941 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850944 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850946 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850949 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850951 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850953 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850956 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850959 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850961 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850964 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850966 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850969 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:36.852950 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850972 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850974 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850977 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850981 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850985 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850988 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850990 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850993 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850995 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.850998 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851000 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851002 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851005 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851007 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851009 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851012 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851014 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851016 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851019 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:36.853543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851021 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851023 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851026 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851028 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851030 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851033 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851036 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851038 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851042 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851045 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851048 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851050 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851053 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851055 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851058 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851061 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851063 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851066 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851068 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851071 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:36.854037 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851073 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851076 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851078 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851080 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851083 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851085 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851088 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851090 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851093 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851095 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851097 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851100 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851102 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851104 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851107 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851110 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851112 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851115 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851117 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:36.854632 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:36.851120 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:36.855091 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.851124 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:30:36.855091 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.852183 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 19 12:30:36.855207 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.855194 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 19 12:30:36.856344 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.856331 2578 server.go:1019] "Starting client certificate rotation"
Apr 19 12:30:36.856451 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.856433 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:30:36.856526 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.856491 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:30:36.888212 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.888185 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:30:36.891043 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.891024 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:30:36.902285 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.902261 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 19 12:30:36.908768 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.908753 2578 log.go:25] "Validated CRI v1 image API"
Apr 19 12:30:36.911783 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.911764 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 19 12:30:36.919782 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.919763 2578 fs.go:135] Filesystem UUIDs: map[2c1a7200-05c8-4f72-bb46-6c75eee641a4:/dev/nvme0n1p4 6a68c055-94f2-4bda-ba4d-61f5fefcef07:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 19 12:30:36.919848 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.919781 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 19 12:30:36.921462 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.921439 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:30:36.925784 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.925678 2578 manager.go:217] Machine: {Timestamp:2026-04-19 12:30:36.923463192 +0000 UTC m=+0.441620797 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100322 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fd805b010149fe88c3332c2448d44 SystemUUID:ec2fd805-b010-149f-e88c-3332c2448d44 BootID:45083930-4ab2-47d3-821c-d0e9b7b5801e Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5f:41:e4:64:bf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5f:41:e4:64:bf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b2:16:23:5f:68:8c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 19 12:30:36.925784 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.925780 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 19 12:30:36.925888 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.925851 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 19 12:30:36.927123 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.927102 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 19 12:30:36.927256 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.927125 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-233.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 19 12:30:36.927301 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.927266 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 19 12:30:36.927301 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.927274 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 19 12:30:36.927301 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.927287 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 12:30:36.928196 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.928186 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 12:30:36.929182 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.929172 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:30:36.929277 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.929269 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 19 12:30:36.932304 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.932295 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 19 12:30:36.932346 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.932308 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 19 12:30:36.932346 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.932319 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 19 12:30:36.932346 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.932329 2578 kubelet.go:397] "Adding apiserver pod source" Apr 19 12:30:36.932346 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.932338 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 19 12:30:36.933522 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.933509 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 12:30:36.933590 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.933534 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 12:30:36.936627 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.936612 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 19 12:30:36.937621 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.937605 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bkfdj" Apr 19 12:30:36.938157 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.938143 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 19 12:30:36.940410 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940397 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940415 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940421 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940426 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940431 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:30:36.940437 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940442 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940448 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940455 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 19 12:30:36.940463 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940460 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 19 12:30:36.940697 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940474 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 19 12:30:36.940697 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.940499 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 19 12:30:36.941635 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.941620 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 19 12:30:36.941703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.941640 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 19 12:30:36.943177 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:36.943148 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-233.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 19 12:30:36.943325 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:36.943300 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list 
resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 19 12:30:36.944617 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.944600 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bkfdj" Apr 19 12:30:36.946419 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.946404 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 19 12:30:36.946520 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.946448 2578 server.go:1295] "Started kubelet" Apr 19 12:30:36.946568 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.946535 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 19 12:30:36.946633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.946595 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 19 12:30:36.946671 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.946648 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 19 12:30:36.947293 ip-10-0-129-233 systemd[1]: Started Kubernetes Kubelet. 
Apr 19 12:30:36.948435 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.948418 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 19 12:30:36.949374 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.949362 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 19 12:30:36.957656 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.957634 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-233.ec2.internal" not found Apr 19 12:30:36.958379 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.958365 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 19 12:30:36.958440 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.958394 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 19 12:30:36.959293 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.959139 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 19 12:30:36.959293 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.959225 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 19 12:30:36.959293 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.959269 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 19 12:30:36.959498 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.959353 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 19 12:30:36.959498 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.959362 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 19 12:30:36.959498 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:36.959362 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 19 12:30:36.960366 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960337 2578 factory.go:153] Registering CRI-O factory Apr 19 12:30:36.960366 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960360 2578 factory.go:223] Registration of the crio container factory successfully Apr 19 12:30:36.960527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960412 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 19 12:30:36.960527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960424 2578 factory.go:55] Registering systemd factory Apr 19 12:30:36.960527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960432 2578 factory.go:223] Registration of the systemd container factory successfully Apr 19 12:30:36.960527 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:36.960432 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:36.960527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960453 2578 factory.go:103] Registering Raw factory Apr 19 12:30:36.960527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960465 2578 manager.go:1196] Started watching for new ooms in manager Apr 19 12:30:36.960724 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.960702 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:36.961069 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.961055 2578 manager.go:319] Starting recovery of all containers Apr 19 12:30:36.961928 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:36.961896 2578 nodelease.go:49] "Failed to get node when trying to set owner 
ref to the node lease" err="nodes \"ip-10-0-129-233.ec2.internal\" not found" node="ip-10-0-129-233.ec2.internal" Apr 19 12:30:36.971297 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.971190 2578 manager.go:324] Recovery completed Apr 19 12:30:36.971913 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.971897 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-233.ec2.internal" not found Apr 19 12:30:36.975070 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.975053 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:36.977733 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.977718 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:36.977815 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.977747 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:36.977815 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.977758 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:36.978193 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.978178 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 19 12:30:36.978255 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.978193 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 19 12:30:36.978255 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.978211 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:30:36.980883 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.980848 2578 policy_none.go:49] "None policy: Start" Apr 19 12:30:36.980883 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:36.980866 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 19 12:30:36.980883 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:30:36.980879 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 19 12:30:37.029056 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029037 2578 manager.go:341] "Starting Device Plugin manager" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.029070 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029082 2578 server.go:85] "Starting device plugin registration server" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029283 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029304 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029441 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029768 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029780 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.029939 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-233.ec2.internal" not found Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.032226 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 19 12:30:37.033633 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.032260 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:37.057246 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.057219 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 19 12:30:37.058492 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.058456 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 19 12:30:37.058565 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.058500 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 19 12:30:37.058565 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.058516 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 19 12:30:37.058565 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.058522 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 19 12:30:37.058565 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.058551 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 19 12:30:37.060020 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.060002 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:37.129578 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.129525 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:37.130678 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.130663 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:37.130749 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.130690 2578 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:37.130749 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.130700 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:37.130749 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.130721 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.138294 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.138282 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.138338 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.138300 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-233.ec2.internal\": node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:37.149352 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.149335 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:37.159444 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.159413 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal"] Apr 19 12:30:37.159511 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.159495 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:37.160600 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.160586 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:37.160659 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.160614 2578 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:37.160659 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.160624 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:37.161855 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.161844 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:37.162008 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.161994 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.162043 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162021 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:37.162677 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162660 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:37.162736 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162690 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:37.162736 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162700 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:37.162796 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162660 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:37.162796 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162764 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 19 12:30:37.162796 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.162779 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:37.163943 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.163927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.164034 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.163956 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:37.164703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.164689 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:37.164790 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.164711 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:37.164790 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.164721 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:37.187946 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.187924 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-233.ec2.internal\" not found" node="ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.192031 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.192017 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-233.ec2.internal\" not found" node="ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.250206 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.250185 2578 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:37.260692 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.260670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e51ff7d933e33546c82969e1605121d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal\" (UID: \"7e51ff7d933e33546c82969e1605121d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.260762 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.260698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e51ff7d933e33546c82969e1605121d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal\" (UID: \"7e51ff7d933e33546c82969e1605121d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.260762 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.260713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b2951febf619ea3eaaaa44d21e7bf15f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-233.ec2.internal\" (UID: \"b2951febf619ea3eaaaa44d21e7bf15f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.351210 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.351190 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:37.361577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.361555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e51ff7d933e33546c82969e1605121d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal\" (UID: \"7e51ff7d933e33546c82969e1605121d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.361663 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.361589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e51ff7d933e33546c82969e1605121d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal\" (UID: \"7e51ff7d933e33546c82969e1605121d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.361663 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.361614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b2951febf619ea3eaaaa44d21e7bf15f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-233.ec2.internal\" (UID: \"b2951febf619ea3eaaaa44d21e7bf15f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.361663 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.361652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b2951febf619ea3eaaaa44d21e7bf15f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-233.ec2.internal\" (UID: \"b2951febf619ea3eaaaa44d21e7bf15f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.361663 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.361654 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e51ff7d933e33546c82969e1605121d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal\" (UID: \"7e51ff7d933e33546c82969e1605121d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.361805 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:30:37.361656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e51ff7d933e33546c82969e1605121d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal\" (UID: \"7e51ff7d933e33546c82969e1605121d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.451936 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.451912 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found" Apr 19 12:30:37.490446 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.490415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" Apr 19 12:30:37.495170 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.495151 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal"
Apr 19 12:30:37.552062 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.552033 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:37.652541 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.652521 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:37.753056 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.753004 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:37.853632 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.853608 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:37.855756 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.855735 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 19 12:30:37.855886 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.855871 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:30:37.855937 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.855900 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:30:37.939764 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:37.939732 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2951febf619ea3eaaaa44d21e7bf15f.slice/crio-150c57c43a5217502010a46596a2b25af45689d01884c6fdba1a67ab1d1b682e WatchSource:0}: Error finding container 150c57c43a5217502010a46596a2b25af45689d01884c6fdba1a67ab1d1b682e: Status 404 returned error can't find the container with id 150c57c43a5217502010a46596a2b25af45689d01884c6fdba1a67ab1d1b682e
Apr 19 12:30:37.939965 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:37.939943 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e51ff7d933e33546c82969e1605121d.slice/crio-dda229cc5252fc4dcb23bfd28c090b3d0862e69fd9576ad8ef578d9f9ead1d6e WatchSource:0}: Error finding container dda229cc5252fc4dcb23bfd28c090b3d0862e69fd9576ad8ef578d9f9ead1d6e: Status 404 returned error can't find the container with id dda229cc5252fc4dcb23bfd28c090b3d0862e69fd9576ad8ef578d9f9ead1d6e
Apr 19 12:30:37.946031 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.945951 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 12:25:36 +0000 UTC" deadline="2027-12-06 14:09:53.005331224 +0000 UTC"
Apr 19 12:30:37.946031 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.945970 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14305h39m15.059363752s"
Apr 19 12:30:37.946445 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.946433 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:30:37.954862 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:37.954847 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:37.959007 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.958993 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 19 12:30:37.967639 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.967620 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:30:37.987256 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.987237 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tx9qd"
Apr 19 12:30:37.993225 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:37.993204 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tx9qd"
Apr 19 12:30:38.055653 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:38.055608 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:38.061504 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.061438 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" event={"ID":"b2951febf619ea3eaaaa44d21e7bf15f","Type":"ContainerStarted","Data":"150c57c43a5217502010a46596a2b25af45689d01884c6fdba1a67ab1d1b682e"}
Apr 19 12:30:38.062420 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.062402 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" event={"ID":"7e51ff7d933e33546c82969e1605121d","Type":"ContainerStarted","Data":"dda229cc5252fc4dcb23bfd28c090b3d0862e69fd9576ad8ef578d9f9ead1d6e"}
Apr 19 12:30:38.155696 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:38.155671 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:38.175530 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.175508 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:38.256740 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:38.256715 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-233.ec2.internal\" not found"
Apr 19 12:30:38.329779 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.329587 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:38.359725 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.359703 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal"
Apr 19 12:30:38.369941 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.369833 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:30:38.371199 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.371007 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal"
Apr 19 12:30:38.378785 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.378694 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:30:38.849572 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.849535 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:38.933927 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.933902 2578 apiserver.go:52] "Watching apiserver"
Apr 19 12:30:38.943129 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.943104 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 19 12:30:38.943458 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.943438 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp","openshift-cluster-node-tuning-operator/tuned-7bp2l","openshift-dns/node-resolver-wpnz9","openshift-multus/multus-6rzvv","openshift-multus/multus-additional-cni-plugins-mdpcd","openshift-multus/network-metrics-daemon-7t9j2","openshift-network-diagnostics/network-check-target-pq9ps","kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal","openshift-image-registry/node-ca-fqzjh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal","openshift-network-operator/iptables-alerter-fwdhb","openshift-ovn-kubernetes/ovnkube-node-jrwbt","kube-system/konnectivity-agent-q6bkc"]
Apr 19 12:30:38.948089 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.946515 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:38.948089 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:38.946626 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:30:38.948089 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.948064 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:38.949386 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.949363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.949530 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.949397 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.950326 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.950284 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.950751 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.950636 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kxnwd\""
Apr 19 12:30:38.951051 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.951032 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:38.951798 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.951779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.952242 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952222 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 19 12:30:38.952326 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952288 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:38.952384 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:38.952354 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:30:38.952527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952507 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-czpz7\""
Apr 19 12:30:38.952593 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952568 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.952857 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952840 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.952857 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952847 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v8krd\""
Apr 19 12:30:38.953010 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952872 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.953010 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.952844 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.953304 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.953288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 19 12:30:38.953378 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.953312 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 19 12:30:38.953427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.953376 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-97fhp\""
Apr 19 12:30:38.953427 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.953389 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 19 12:30:38.953711 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.953696 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:38.954828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.954812 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fqzjh"
Apr 19 12:30:38.955632 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.955612 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 19 12:30:38.955830 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.955813 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.955901 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.955870 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.955960 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.955813 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8lhqs\""
Apr 19 12:30:38.956284 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.956262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fwdhb"
Apr 19 12:30:38.957421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.957236 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.957421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.957272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qpff5\""
Apr 19 12:30:38.957421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.957282 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 19 12:30:38.957629 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.957580 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.957779 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.957760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.958301 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.958285 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.958428 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.958413 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 19 12:30:38.958515 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.958504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ltct2\""
Apr 19 12:30:38.958705 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.958689 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.959116 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.959102 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:30:38.959745 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.959728 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 19 12:30:38.960200 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.960187 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 19 12:30:38.960685 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.960667 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 19 12:30:38.960765 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.960725 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 19 12:30:38.960943 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.960921 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 19 12:30:38.961185 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.961167 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 19 12:30:38.961824 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.961292 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 19 12:30:38.961824 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.961464 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 19 12:30:38.961824 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.961496 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hjddv\""
Apr 19 12:30:38.961824 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.961657 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vbmv6\""
Apr 19 12:30:38.961824 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.961718 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 19 12:30:38.969806 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969700 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-hosts-file\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:38.969806 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:38.969806 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngmd\" (UniqueName: \"kubernetes.io/projected/5511d94c-29bb-45a0-b060-745261d9a2e8-kube-api-access-kngmd\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh"
Apr 19 12:30:38.969806 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-etc-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.969992 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f60428bf-3e0e-4720-9fca-76545b47e8e4-host-slash\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb"
Apr 19 12:30:38.969992 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:38.969992 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-cni-netd\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.969992 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.969979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-tmp\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-device-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-system-cni-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-cnibin\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970075 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-netns\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-cni-multus\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-cni-bin\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-ovnkube-script-lib\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-lib-modules\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-var-lib-kubelet\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cni-binary-copy\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970262 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f60428bf-3e0e-4720-9fca-76545b47e8e4-iptables-alerter-script\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970288 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-etc-kubernetes\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mdk\" (UniqueName: \"kubernetes.io/projected/46c7636d-9cd5-47c0-afaa-e58b27072e37-kube-api-access-74mdk\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-cni-bin\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-ovnkube-config\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970448 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-env-overrides\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-modprobe-d\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970537 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-cni-binary-copy\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-kubelet\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-conf-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-var-lib-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84l6q\" (UniqueName: \"kubernetes.io/projected/dc60d29d-7874-4905-9075-ae159b1131a3-kube-api-access-84l6q\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysconfig\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-systemd\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-multus-certs\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-systemd-units\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt5fn\" (UniqueName: \"kubernetes.io/projected/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-kube-api-access-wt5fn\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970759 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-tmp-dir\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-hostroot\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz7f8\" (UniqueName: \"kubernetes.io/projected/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-kube-api-access-gz7f8\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc60d29d-7874-4905-9075-ae159b1131a3-ovn-node-metrics-cert\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.970854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-run\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-etc-selinux\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-system-cni-dir\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-node-log\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.970981 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ea2e728c-ff62-44d4-999e-021181a80e96-konnectivity-ca\") pod \"konnectivity-agent-q6bkc\" (UID: \"ea2e728c-ff62-44d4-999e-021181a80e96\") " pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-sys\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-tuned\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k7d7\" (UniqueName: \"kubernetes.io/projected/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-kube-api-access-2k7d7\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971080 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-slash\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-kubernetes\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysctl-conf\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971201 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cnibin\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-cni-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971285 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5511d94c-29bb-45a0-b060-745261d9a2e8-serviceca\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-systemd\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.971491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-log-socket\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-host\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971414 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtrl\" (UniqueName: \"kubernetes.io/projected/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-kube-api-access-bvtrl\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " 
pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-daemon-config\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-run-netns\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-ovn\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-socket-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-registration-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-os-release\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:38.972093 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:30:38.971609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-k8s-cni-cncf-io\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971623 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ea2e728c-ff62-44d4-999e-021181a80e96-agent-certs\") pod \"konnectivity-agent-q6bkc\" (UID: \"ea2e728c-ff62-44d4-999e-021181a80e96\") " pod="kube-system/konnectivity-agent-q6bkc" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-os-release\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:38.972093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-socket-dir-parent\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:38.972577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5511d94c-29bb-45a0-b060-745261d9a2e8-host\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh" Apr 19 12:30:38.972577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysctl-d\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:38.972577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-sys-fs\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:38.972577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqf76\" (UniqueName: \"kubernetes.io/projected/07fe61c2-3d59-48c1-bd4e-911c6609df12-kube-api-access-pqf76\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:38.972577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971808 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4m4\" (UniqueName: \"kubernetes.io/projected/f60428bf-3e0e-4720-9fca-76545b47e8e4-kube-api-access-zp4m4\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb" Apr 19 12:30:38.972577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.971846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-kubelet\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:38.993871 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.993831 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:25:37 +0000 UTC" deadline="2027-10-12 01:04:13.355639596 +0000 UTC" Apr 19 12:30:38.993934 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:38.993871 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12972h33m34.361771592s" Apr 19 12:30:39.072956 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.072928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f60428bf-3e0e-4720-9fca-76545b47e8e4-host-slash\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb" Apr 19 12:30:39.073100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.072970 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.073100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.072996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-cni-netd\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.073100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-tmp\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.073100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-device-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:39.073100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f60428bf-3e0e-4720-9fca-76545b47e8e4-host-slash\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb" Apr 19 12:30:39.073100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073075 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-system-cni-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-device-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-system-cni-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-cni-netd\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-cnibin\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-netns\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-cni-multus\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-cnibin\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-cni-bin\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-netns\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-ovnkube-script-lib\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-cni-multus\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-cni-bin\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-lib-modules\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-var-lib-kubelet\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.073359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cni-binary-copy\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f60428bf-3e0e-4720-9fca-76545b47e8e4-iptables-alerter-script\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-var-lib-kubelet\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-lib-modules\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073497 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073510 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-etc-kubernetes\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74mdk\" (UniqueName: \"kubernetes.io/projected/46c7636d-9cd5-47c0-afaa-e58b27072e37-kube-api-access-74mdk\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073574 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-etc-kubernetes\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-cni-bin\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: 
I0419 12:30:39.073631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-ovnkube-config\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-env-overrides\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-cni-bin\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-modprobe-d\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-cni-binary-copy\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-kubelet\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-conf-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074054 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-var-lib-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l6q\" (UniqueName: \"kubernetes.io/projected/dc60d29d-7874-4905-9075-ae159b1131a3-kube-api-access-84l6q\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysconfig\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-systemd\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073938 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-multus-certs\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.073983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-systemd-units\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cni-binary-copy\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt5fn\" (UniqueName: \"kubernetes.io/projected/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-kube-api-access-wt5fn\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-tmp-dir\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074086 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-hostroot\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz7f8\" (UniqueName: \"kubernetes.io/projected/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-kube-api-access-gz7f8\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-hostroot\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-tmp-dir\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc60d29d-7874-4905-9075-ae159b1131a3-ovn-node-metrics-cert\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.074818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-run\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f60428bf-3e0e-4720-9fca-76545b47e8e4-iptables-alerter-script\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-etc-selinux\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-system-cni-dir\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-ovnkube-script-lib\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-systemd\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-node-log\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-modprobe-d\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-env-overrides\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-var-lib-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysconfig\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-multus-certs\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-node-log\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-var-lib-kubelet\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-etc-selinux\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-system-cni-dir\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-systemd-units\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.075828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074782 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-run\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ea2e728c-ff62-44d4-999e-021181a80e96-konnectivity-ca\") pod \"konnectivity-agent-q6bkc\" (UID: \"ea2e728c-ff62-44d4-999e-021181a80e96\") " pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-sys\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-conf-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-tuned\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k7d7\" (UniqueName: \"kubernetes.io/projected/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-kube-api-access-2k7d7\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-slash\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.074979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-kubernetes\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysctl-conf\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cnibin\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-sys\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-cni-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5511d94c-29bb-45a0-b060-745261d9a2e8-serviceca\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-cni-binary-copy\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-cni-dir\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-slash\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.076634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-cnibin\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-kubernetes\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysctl-conf\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-systemd\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-log-socket\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-host\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtrl\" (UniqueName: \"kubernetes.io/projected/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-kube-api-access-bvtrl\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075439 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-daemon-config\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-log-socket\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-run-netns\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-run-netns\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-ovn\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-systemd\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:39.077381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-socket-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-host\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-registration-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-os-release\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-k8s-cni-cncf-io\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ea2e728c-ff62-44d4-999e-021181a80e96-agent-certs\") pod \"konnectivity-agent-q6bkc\" (UID: \"ea2e728c-ff62-44d4-999e-021181a80e96\") " pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-os-release\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-socket-dir-parent\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-registration-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5511d94c-29bb-45a0-b060-745261d9a2e8-host\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysctl-d\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-sys-fs\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.075993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqf76\" (UniqueName: \"kubernetes.io/projected/07fe61c2-3d59-48c1-bd4e-911c6609df12-kube-api-access-pqf76\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4m4\" (UniqueName: \"kubernetes.io/projected/f60428bf-3e0e-4720-9fca-76545b47e8e4-kube-api-access-zp4m4\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-kubelet\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-hosts-file\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9"
Apr 19 12:30:39.077896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kngmd\" (UniqueName: \"kubernetes.io/projected/5511d94c-29bb-45a0-b060-745261d9a2e8-kube-api-access-kngmd\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh"
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-etc-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ea2e728c-ff62-44d4-999e-021181a80e96-konnectivity-ca\") pod \"konnectivity-agent-q6bkc\" (UID: \"ea2e728c-ff62-44d4-999e-021181a80e96\") " pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.076685 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.076842 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:39.57679644 +0000 UTC m=+3.094954055 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.076880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-sys-fs\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp"
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-host-kubelet\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-daemon-config\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-hosts-file\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5511d94c-29bb-45a0-b060-745261d9a2e8-serviceca\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077560 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-etc-openvswitch\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/dc60d29d-7874-4905-9075-ae159b1131a3-run-ovn\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-sysctl-d\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dc60d29d-7874-4905-9075-ae159b1131a3-ovnkube-config\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.078397 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-os-release\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-os-release\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.077935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07fe61c2-3d59-48c1-bd4e-911c6609df12-socket-dir\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.078000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-host-run-k8s-cni-cncf-io\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.078021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-multus-socket-dir-parent\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.078052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc60d29d-7874-4905-9075-ae159b1131a3-ovn-node-metrics-cert\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.078230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-etc-tuned\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.078294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5511d94c-29bb-45a0-b060-745261d9a2e8-host\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh" Apr 19 12:30:39.079361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.078356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-tmp\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.080361 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.080334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ea2e728c-ff62-44d4-999e-021181a80e96-agent-certs\") pod \"konnectivity-agent-q6bkc\" (UID: \"ea2e728c-ff62-44d4-999e-021181a80e96\") " pod="kube-system/konnectivity-agent-q6bkc" Apr 19 12:30:39.085130 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.084946 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:39.085130 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.084972 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:39.085130 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.084987 2578 projected.go:194] Error preparing data for projected volume 
kube-api-access-67mbw for pod openshift-network-diagnostics/network-check-target-pq9ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:39.085130 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.085110 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw podName:6189177e-28b2-4186-81ad-531fed1d1ada nodeName:}" failed. No retries permitted until 2026-04-19 12:30:39.585092744 +0000 UTC m=+3.103250350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-67mbw" (UniqueName: "kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw") pod "network-check-target-pq9ps" (UID: "6189177e-28b2-4186-81ad-531fed1d1ada") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:39.085637 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.085430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k7d7\" (UniqueName: \"kubernetes.io/projected/f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0-kube-api-access-2k7d7\") pod \"node-resolver-wpnz9\" (UID: \"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0\") " pod="openshift-dns/node-resolver-wpnz9" Apr 19 12:30:39.086226 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.086070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz7f8\" (UniqueName: \"kubernetes.io/projected/c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f-kube-api-access-gz7f8\") pod \"multus-6rzvv\" (UID: \"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f\") " pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.086468 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.086445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wt5fn\" (UniqueName: \"kubernetes.io/projected/64992d8d-7ceb-4b79-86e0-9d18a07eb0fe-kube-api-access-wt5fn\") pod \"tuned-7bp2l\" (UID: \"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.088119 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.088059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4m4\" (UniqueName: \"kubernetes.io/projected/f60428bf-3e0e-4720-9fca-76545b47e8e4-kube-api-access-zp4m4\") pod \"iptables-alerter-fwdhb\" (UID: \"f60428bf-3e0e-4720-9fca-76545b47e8e4\") " pod="openshift-network-operator/iptables-alerter-fwdhb" Apr 19 12:30:39.088119 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.088069 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84l6q\" (UniqueName: \"kubernetes.io/projected/dc60d29d-7874-4905-9075-ae159b1131a3-kube-api-access-84l6q\") pod \"ovnkube-node-jrwbt\" (UID: \"dc60d29d-7874-4905-9075-ae159b1131a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.088340 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.088319 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngmd\" (UniqueName: \"kubernetes.io/projected/5511d94c-29bb-45a0-b060-745261d9a2e8-kube-api-access-kngmd\") pod \"node-ca-fqzjh\" (UID: \"5511d94c-29bb-45a0-b060-745261d9a2e8\") " pod="openshift-image-registry/node-ca-fqzjh" Apr 19 12:30:39.088687 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.088668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtrl\" (UniqueName: \"kubernetes.io/projected/59d9fb8b-c8ff-4890-ae51-0f7fa04e6865-kube-api-access-bvtrl\") pod \"multus-additional-cni-plugins-mdpcd\" (UID: \"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865\") " pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.089512 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.089493 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqf76\" (UniqueName: \"kubernetes.io/projected/07fe61c2-3d59-48c1-bd4e-911c6609df12-kube-api-access-pqf76\") pod \"aws-ebs-csi-driver-node-wl9zp\" (UID: \"07fe61c2-3d59-48c1-bd4e-911c6609df12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:39.089656 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.089630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mdk\" (UniqueName: \"kubernetes.io/projected/46c7636d-9cd5-47c0-afaa-e58b27072e37-kube-api-access-74mdk\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:39.261333 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.261295 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wpnz9" Apr 19 12:30:39.267191 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.267170 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" Apr 19 12:30:39.274742 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.274725 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6rzvv" Apr 19 12:30:39.279358 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.279339 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" Apr 19 12:30:39.286013 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.285996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" Apr 19 12:30:39.292542 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.292527 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fqzjh" Apr 19 12:30:39.299020 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.299004 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fwdhb" Apr 19 12:30:39.305329 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.305312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" Apr 19 12:30:39.309902 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.309885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q6bkc" Apr 19 12:30:39.380372 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.380350 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:39.511325 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.511225 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60428bf_3e0e_4720_9fca_76545b47e8e4.slice/crio-f40286428a556a2ee48dd5fbe5ead9bb62e19f2aff2d57b845dee32248e12e86 WatchSource:0}: Error finding container f40286428a556a2ee48dd5fbe5ead9bb62e19f2aff2d57b845dee32248e12e86: Status 404 returned error can't find the container with id f40286428a556a2ee48dd5fbe5ead9bb62e19f2aff2d57b845dee32248e12e86 Apr 19 12:30:39.515011 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.514946 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea2e728c_ff62_44d4_999e_021181a80e96.slice/crio-4753e28ff732a0a172fbe07c8964b4bf2728fa1ce62b61e5bebddf0d1012d82e WatchSource:0}: Error finding container 4753e28ff732a0a172fbe07c8964b4bf2728fa1ce62b61e5bebddf0d1012d82e: Status 404 returned error can't find the container with id 
4753e28ff732a0a172fbe07c8964b4bf2728fa1ce62b61e5bebddf0d1012d82e Apr 19 12:30:39.517034 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.517017 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59d9fb8b_c8ff_4890_ae51_0f7fa04e6865.slice/crio-f66d64f3bfefd0277b29317a3224d123d9132a7fd89951d0ac373f1a5d07fe89 WatchSource:0}: Error finding container f66d64f3bfefd0277b29317a3224d123d9132a7fd89951d0ac373f1a5d07fe89: Status 404 returned error can't find the container with id f66d64f3bfefd0277b29317a3224d123d9132a7fd89951d0ac373f1a5d07fe89 Apr 19 12:30:39.518410 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.518293 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56e0b8d_b9f5_437d_95c9_cd46b8dbcea0.slice/crio-19f0c3047e9020cb36e170fcc3083e12f2de0ddfdc04bc924d0bae40b7f2177c WatchSource:0}: Error finding container 19f0c3047e9020cb36e170fcc3083e12f2de0ddfdc04bc924d0bae40b7f2177c: Status 404 returned error can't find the container with id 19f0c3047e9020cb36e170fcc3083e12f2de0ddfdc04bc924d0bae40b7f2177c Apr 19 12:30:39.519352 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.519331 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64992d8d_7ceb_4b79_86e0_9d18a07eb0fe.slice/crio-bfc226773282aa4c692731b6be8187fa1d576eeca6e3c96b00260414e1716e64 WatchSource:0}: Error finding container bfc226773282aa4c692731b6be8187fa1d576eeca6e3c96b00260414e1716e64: Status 404 returned error can't find the container with id bfc226773282aa4c692731b6be8187fa1d576eeca6e3c96b00260414e1716e64 Apr 19 12:30:39.520466 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.520428 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc60d29d_7874_4905_9075_ae159b1131a3.slice/crio-e7f086b76872754687a372909a9169c66c531a38f0eb7638ba970ace526413ab WatchSource:0}: Error finding container e7f086b76872754687a372909a9169c66c531a38f0eb7638ba970ace526413ab: Status 404 returned error can't find the container with id e7f086b76872754687a372909a9169c66c531a38f0eb7638ba970ace526413ab Apr 19 12:30:39.522181 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.521588 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07fe61c2_3d59_48c1_bd4e_911c6609df12.slice/crio-5216d729d283827eb7578db8b3858619cbf0541d8434f54a26a024b6214e5298 WatchSource:0}: Error finding container 5216d729d283827eb7578db8b3858619cbf0541d8434f54a26a024b6214e5298: Status 404 returned error can't find the container with id 5216d729d283827eb7578db8b3858619cbf0541d8434f54a26a024b6214e5298 Apr 19 12:30:39.523779 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.523562 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc762d1dd_0bb7_4a1e_83a7_5a20dfd1674f.slice/crio-3e5970ac283a9b9bbb2e82702cf40baf575d00a71476e8bc390e88fed7cac36d WatchSource:0}: Error finding container 3e5970ac283a9b9bbb2e82702cf40baf575d00a71476e8bc390e88fed7cac36d: Status 404 returned error can't find the container with id 3e5970ac283a9b9bbb2e82702cf40baf575d00a71476e8bc390e88fed7cac36d Apr 19 12:30:39.524022 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:30:39.523998 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5511d94c_29bb_45a0_b060_745261d9a2e8.slice/crio-8c65c681dc2a5cea8cb0abcfc4b5161f24c327ee9beb70f1110cf6b1dbbad1ad WatchSource:0}: Error finding container 8c65c681dc2a5cea8cb0abcfc4b5161f24c327ee9beb70f1110cf6b1dbbad1ad: Status 404 returned error can't find 
the container with id 8c65c681dc2a5cea8cb0abcfc4b5161f24c327ee9beb70f1110cf6b1dbbad1ad Apr 19 12:30:39.580676 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.580655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:39.580801 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.580785 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:39.580848 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.580842 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:40.580826299 +0000 UTC m=+4.098983890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:39.681351 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.681324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:39.681520 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.681428 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:39.681520 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.681450 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:39.681520 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.681459 2578 projected.go:194] Error preparing data for projected volume kube-api-access-67mbw for pod openshift-network-diagnostics/network-check-target-pq9ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:39.681520 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:39.681511 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw podName:6189177e-28b2-4186-81ad-531fed1d1ada nodeName:}" failed. 
No retries permitted until 2026-04-19 12:30:40.681499232 +0000 UTC m=+4.199656823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-67mbw" (UniqueName: "kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw") pod "network-check-target-pq9ps" (UID: "6189177e-28b2-4186-81ad-531fed1d1ada") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:39.994119 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.994081 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:25:37 +0000 UTC" deadline="2028-01-22 11:01:28.110066532 +0000 UTC" Apr 19 12:30:39.994119 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:39.994116 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15430h30m48.115954069s" Apr 19 12:30:40.058895 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.058865 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:40.059054 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:40.059000 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:30:40.077398 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.077365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fqzjh" event={"ID":"5511d94c-29bb-45a0-b060-745261d9a2e8","Type":"ContainerStarted","Data":"8c65c681dc2a5cea8cb0abcfc4b5161f24c327ee9beb70f1110cf6b1dbbad1ad"} Apr 19 12:30:40.082075 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.082028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" event={"ID":"07fe61c2-3d59-48c1-bd4e-911c6609df12","Type":"ContainerStarted","Data":"5216d729d283827eb7578db8b3858619cbf0541d8434f54a26a024b6214e5298"} Apr 19 12:30:40.089722 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.089691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" event={"ID":"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe","Type":"ContainerStarted","Data":"bfc226773282aa4c692731b6be8187fa1d576eeca6e3c96b00260414e1716e64"} Apr 19 12:30:40.099537 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.099510 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wpnz9" event={"ID":"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0","Type":"ContainerStarted","Data":"19f0c3047e9020cb36e170fcc3083e12f2de0ddfdc04bc924d0bae40b7f2177c"} Apr 19 12:30:40.110035 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.110008 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q6bkc" event={"ID":"ea2e728c-ff62-44d4-999e-021181a80e96","Type":"ContainerStarted","Data":"4753e28ff732a0a172fbe07c8964b4bf2728fa1ce62b61e5bebddf0d1012d82e"} Apr 19 12:30:40.120762 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.120711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fwdhb" 
event={"ID":"f60428bf-3e0e-4720-9fca-76545b47e8e4","Type":"ContainerStarted","Data":"f40286428a556a2ee48dd5fbe5ead9bb62e19f2aff2d57b845dee32248e12e86"} Apr 19 12:30:40.128375 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.127907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" event={"ID":"b2951febf619ea3eaaaa44d21e7bf15f","Type":"ContainerStarted","Data":"3b23905223bc23070464c911b3cb181edcfae85cdb35bc9fe6ac3b7b6b518505"} Apr 19 12:30:40.134192 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.134164 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"e7f086b76872754687a372909a9169c66c531a38f0eb7638ba970ace526413ab"} Apr 19 12:30:40.141555 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.141529 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerStarted","Data":"f66d64f3bfefd0277b29317a3224d123d9132a7fd89951d0ac373f1a5d07fe89"} Apr 19 12:30:40.143579 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.143555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rzvv" event={"ID":"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f","Type":"ContainerStarted","Data":"3e5970ac283a9b9bbb2e82702cf40baf575d00a71476e8bc390e88fed7cac36d"} Apr 19 12:30:40.588913 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.588864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:40.589064 ip-10-0-129-233 kubenswrapper[2578]: E0419 
12:30:40.589045 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:40.589131 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:40.589118 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:42.589097929 +0000 UTC m=+6.107255522 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:40.690369 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:40.689796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:40.690369 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:40.689946 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:40.690369 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:40.689963 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:40.690369 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:40.689977 2578 projected.go:194] Error preparing data for projected volume kube-api-access-67mbw for pod 
openshift-network-diagnostics/network-check-target-pq9ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:40.690369 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:40.690052 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw podName:6189177e-28b2-4186-81ad-531fed1d1ada nodeName:}" failed. No retries permitted until 2026-04-19 12:30:42.690033489 +0000 UTC m=+6.208191095 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-67mbw" (UniqueName: "kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw") pod "network-check-target-pq9ps" (UID: "6189177e-28b2-4186-81ad-531fed1d1ada") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:41.061643 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:41.061609 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:41.062181 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:41.062149 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:30:41.154766 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:41.153658 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e51ff7d933e33546c82969e1605121d" containerID="777a443d5c162372790c65a784e03a5f969eace9d7a2dc68ebf9933540a1c2dc" exitCode=0 Apr 19 12:30:41.154766 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:41.154559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" event={"ID":"7e51ff7d933e33546c82969e1605121d","Type":"ContainerDied","Data":"777a443d5c162372790c65a784e03a5f969eace9d7a2dc68ebf9933540a1c2dc"} Apr 19 12:30:41.166392 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:41.166331 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-233.ec2.internal" podStartSLOduration=3.166315623 podStartE2EDuration="3.166315623s" podCreationTimestamp="2026-04-19 12:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:30:40.141617113 +0000 UTC m=+3.659774725" watchObservedRunningTime="2026-04-19 12:30:41.166315623 +0000 UTC m=+4.684473240" Apr 19 12:30:42.059240 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:42.059207 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:42.059398 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.059345 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:30:42.162028 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:42.161992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" event={"ID":"7e51ff7d933e33546c82969e1605121d","Type":"ContainerStarted","Data":"75ce7c17ff0dc27e207fecaaa390ebca7f3b3b8f735cf484a4ddf6c74ef35723"} Apr 19 12:30:42.604733 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:42.604694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:42.604900 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.604863 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:42.604977 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.604918 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:46.604903231 +0000 UTC m=+10.123060822 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:42.705627 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:42.705587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:42.705872 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.705757 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:42.705872 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.705773 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:42.705872 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.705785 2578 projected.go:194] Error preparing data for projected volume kube-api-access-67mbw for pod openshift-network-diagnostics/network-check-target-pq9ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:42.705872 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:42.705847 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw podName:6189177e-28b2-4186-81ad-531fed1d1ada nodeName:}" failed. 
No retries permitted until 2026-04-19 12:30:46.705832851 +0000 UTC m=+10.223990441 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-67mbw" (UniqueName: "kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw") pod "network-check-target-pq9ps" (UID: "6189177e-28b2-4186-81ad-531fed1d1ada") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:43.060770 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:43.060733 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:43.060949 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:43.060894 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:30:44.059163 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:44.059023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:44.059163 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:44.059156 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:30:45.059733 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.059698 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:45.060188 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:45.059841 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:30:45.561870 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.561793 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-233.ec2.internal" podStartSLOduration=7.561773581 podStartE2EDuration="7.561773581s" podCreationTimestamp="2026-04-19 12:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:30:42.174996918 +0000 UTC m=+5.693154592" watchObservedRunningTime="2026-04-19 12:30:45.561773581 +0000 UTC m=+9.079931197" Apr 19 12:30:45.562392 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.562372 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bjv2v"] Apr 19 12:30:45.569136 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.568808 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.569136 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:45.568899 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b" Apr 19 12:30:45.628966 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.628926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/635b706c-4824-43f2-9e8f-fed36e897e9b-kubelet-config\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.629116 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.628976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/635b706c-4824-43f2-9e8f-fed36e897e9b-dbus\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.629171 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.629116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.729769 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.729735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.729932 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.729790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/635b706c-4824-43f2-9e8f-fed36e897e9b-kubelet-config\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.729932 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.729814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/635b706c-4824-43f2-9e8f-fed36e897e9b-dbus\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.730046 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.729980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/635b706c-4824-43f2-9e8f-fed36e897e9b-dbus\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:45.730097 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:45.730079 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:45.730150 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:45.730137 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret podName:635b706c-4824-43f2-9e8f-fed36e897e9b nodeName:}" failed. 
No retries permitted until 2026-04-19 12:30:46.230118795 +0000 UTC m=+9.748276401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret") pod "global-pull-secret-syncer-bjv2v" (UID: "635b706c-4824-43f2-9e8f-fed36e897e9b") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:45.730394 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:45.730373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/635b706c-4824-43f2-9e8f-fed36e897e9b-kubelet-config\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:46.059226 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:46.059199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:46.059376 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.059297 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:30:46.235055 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:46.235021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:46.235469 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.235184 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:46.235469 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.235260 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret podName:635b706c-4824-43f2-9e8f-fed36e897e9b nodeName:}" failed. No retries permitted until 2026-04-19 12:30:47.235239727 +0000 UTC m=+10.753397323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret") pod "global-pull-secret-syncer-bjv2v" (UID: "635b706c-4824-43f2-9e8f-fed36e897e9b") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:46.639121 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:46.639087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:46.639299 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.639254 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:46.639352 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.639318 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:54.639294868 +0000 UTC m=+18.157452463 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:46.739723 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:46.739687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:46.739883 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.739859 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:46.739883 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.739881 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:46.739994 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.739895 2578 projected.go:194] Error preparing data for projected volume kube-api-access-67mbw for pod openshift-network-diagnostics/network-check-target-pq9ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:46.739994 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:46.739958 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw podName:6189177e-28b2-4186-81ad-531fed1d1ada nodeName:}" failed. 
No retries permitted until 2026-04-19 12:30:54.739939147 +0000 UTC m=+18.258096740 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-67mbw" (UniqueName: "kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw") pod "network-check-target-pq9ps" (UID: "6189177e-28b2-4186-81ad-531fed1d1ada") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:47.060466 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:47.060438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:47.060630 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:47.060594 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b" Apr 19 12:30:47.060705 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:47.060644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:47.060803 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:47.060759 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:30:47.246210 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:47.246175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:47.246685 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:47.246294 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:47.246685 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:47.246358 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret podName:635b706c-4824-43f2-9e8f-fed36e897e9b nodeName:}" failed. No retries permitted until 2026-04-19 12:30:49.24633807 +0000 UTC m=+12.764495661 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret") pod "global-pull-secret-syncer-bjv2v" (UID: "635b706c-4824-43f2-9e8f-fed36e897e9b") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:48.059722 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:48.059692 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:30:48.059898 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:48.059817 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:30:49.062041 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:49.061967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:49.062369 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:49.061967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:30:49.062369 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:49.062060 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b" Apr 19 12:30:49.062369 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:49.062143 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:30:49.260944 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:49.260912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:30:49.261116 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:49.261052 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:49.261116 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:49.261112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret podName:635b706c-4824-43f2-9e8f-fed36e897e9b nodeName:}" failed. No retries permitted until 2026-04-19 12:30:53.261092912 +0000 UTC m=+16.779250543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret") pod "global-pull-secret-syncer-bjv2v" (UID: "635b706c-4824-43f2-9e8f-fed36e897e9b") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:50.059643 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:50.059613 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:50.059773 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:50.059712 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:30:51.058894 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:51.058860 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:51.058894 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:51.058883 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:30:51.059376 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:51.059004 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:30:51.059376 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:51.059156 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:30:52.059605 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:52.059569 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:52.060050 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:52.059688 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:30:53.062336 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:53.062303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:53.062770 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:53.062303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:30:53.062770 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:53.062430 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:30:53.062770 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:53.062467 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:30:53.290834 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:53.290791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:30:53.290978 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:53.290952 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:53.291034 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:53.291020 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret podName:635b706c-4824-43f2-9e8f-fed36e897e9b nodeName:}" failed. No retries permitted until 2026-04-19 12:31:01.291003912 +0000 UTC m=+24.809161521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret") pod "global-pull-secret-syncer-bjv2v" (UID: "635b706c-4824-43f2-9e8f-fed36e897e9b") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:54.059651 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:54.059624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:54.059812 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.059725 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:30:54.700664 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:54.700631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:54.701107 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.700801 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:54.701107 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.700882 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:10.700860759 +0000 UTC m=+34.219018356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:54.801939 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:54.801904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:54.802083 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.802052 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:54.802083 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.802073 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:54.802083 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.802083 2578 projected.go:194] Error preparing data for projected volume kube-api-access-67mbw for pod openshift-network-diagnostics/network-check-target-pq9ps: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:54.802198 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:54.802136 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw podName:6189177e-28b2-4186-81ad-531fed1d1ada nodeName:}" failed. No retries permitted until 2026-04-19 12:31:10.802117873 +0000 UTC m=+34.320275464 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-67mbw" (UniqueName: "kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw") pod "network-check-target-pq9ps" (UID: "6189177e-28b2-4186-81ad-531fed1d1ada") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:55.059146 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:55.059059 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:55.059289 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:55.059181 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:30:55.059289 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:55.059243 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:30:55.059390 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:55.059358 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:30:56.059215 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:56.059184 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:56.059713 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:56.059313 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:30:57.060958 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.060616 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:30:57.060958 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:57.060908 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:30:57.064557 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.062177 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:57.064557 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:57.062294 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:30:57.188509 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.188462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rzvv" event={"ID":"c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f","Type":"ContainerStarted","Data":"ead60122bc38e0319a7b7f4c2dffb23ea7bd8e27133fa6852108498c42f95dcb"}
Apr 19 12:30:57.189627 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.189601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fqzjh" event={"ID":"5511d94c-29bb-45a0-b060-745261d9a2e8","Type":"ContainerStarted","Data":"975fdb284cae4bd960d73b937843b43df75f1dab056314c6e8a8f2b41ef865be"}
Apr 19 12:30:57.190595 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.190576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" event={"ID":"07fe61c2-3d59-48c1-bd4e-911c6609df12","Type":"ContainerStarted","Data":"8fdc5c5d482d0a8261b1ee64245a416a5a64705a6ef78dea55ff62211225563e"}
Apr 19 12:30:57.191571 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.191551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" event={"ID":"64992d8d-7ceb-4b79-86e0-9d18a07eb0fe","Type":"ContainerStarted","Data":"d1417a8f46c6f6e9c77f19c8a19bb9a7f3a74e95b680e07765a3bd539156f9f6"}
Apr 19 12:30:57.192553 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.192536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wpnz9" event={"ID":"f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0","Type":"ContainerStarted","Data":"508c24d1e1cb65fc01ade896d8e7ceaaa110805cbce6ff38582423ba96b75f77"}
Apr 19 12:30:57.193544 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.193524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q6bkc" event={"ID":"ea2e728c-ff62-44d4-999e-021181a80e96","Type":"ContainerStarted","Data":"a36872cec2cecae6de9351bf320abfc544b24d682f6da6feecfcc454fc754349"}
Apr 19 12:30:57.194578 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.194559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:30:57.194965 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.194940 2578 generic.go:358] "Generic (PLEG): container finished" podID="dc60d29d-7874-4905-9075-ae159b1131a3" containerID="2f5fbddad0d6c15d1bf21137c392f233b8c2321116d508892c469008166b69dc" exitCode=1
Apr 19 12:30:57.195024 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.195009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerDied","Data":"2f5fbddad0d6c15d1bf21137c392f233b8c2321116d508892c469008166b69dc"}
Apr 19 12:30:57.195078 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.195031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"ddb877f4eccc300964ed538dd0189a682e70ad3bd66d257fa7e4089fc178d858"}
Apr 19 12:30:57.196498 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.196464 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerStarted","Data":"510f14192fb5c0d53bab9b56f1da87f3e10332b8991acae44e6c5085c78c6cc9"}
Apr 19 12:30:57.207909 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.207861 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fqzjh" podStartSLOduration=2.9133677430000002 podStartE2EDuration="20.207844109s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.52610823 +0000 UTC m=+3.044265820" lastFinishedPulling="2026-04-19 12:30:56.820584576 +0000 UTC m=+20.338742186" observedRunningTime="2026-04-19 12:30:57.20720078 +0000 UTC m=+20.725358394" watchObservedRunningTime="2026-04-19 12:30:57.207844109 +0000 UTC m=+20.726001722"
Apr 19 12:30:57.224426 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.224378 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7bp2l" podStartSLOduration=2.909338389 podStartE2EDuration="20.224364102s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.521062225 +0000 UTC m=+3.039219822" lastFinishedPulling="2026-04-19 12:30:56.836087926 +0000 UTC m=+20.354245535" observedRunningTime="2026-04-19 12:30:57.224045221 +0000 UTC m=+20.742202834" watchObservedRunningTime="2026-04-19 12:30:57.224364102 +0000 UTC m=+20.742521714"
Apr 19 12:30:57.242541 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.242471 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6rzvv" podStartSLOduration=2.921009651 podStartE2EDuration="20.242455603s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.525743385 +0000 UTC m=+3.043900980" lastFinishedPulling="2026-04-19 12:30:56.84718934 +0000 UTC m=+20.365346932" observedRunningTime="2026-04-19 12:30:57.241689039 +0000 UTC m=+20.759846652" watchObservedRunningTime="2026-04-19 12:30:57.242455603 +0000 UTC m=+20.760613215"
Apr 19 12:30:57.277669 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.277620 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q6bkc" podStartSLOduration=3.257840886 podStartE2EDuration="20.277601788s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.51655959 +0000 UTC m=+3.034717180" lastFinishedPulling="2026-04-19 12:30:56.536320479 +0000 UTC m=+20.054478082" observedRunningTime="2026-04-19 12:30:57.276790293 +0000 UTC m=+20.794947906" watchObservedRunningTime="2026-04-19 12:30:57.277601788 +0000 UTC m=+20.795759409"
Apr 19 12:30:57.292225 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:57.292176 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wpnz9" podStartSLOduration=2.971015897 podStartE2EDuration="20.292160945s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.52016873 +0000 UTC m=+3.038326323" lastFinishedPulling="2026-04-19 12:30:56.841313776 +0000 UTC m=+20.359471371" observedRunningTime="2026-04-19 12:30:57.291728522 +0000 UTC m=+20.809886134" watchObservedRunningTime="2026-04-19 12:30:57.292160945 +0000 UTC m=+20.810318558"
Apr 19 12:30:58.058802 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.058637 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:30:58.058946 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:58.058864 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:30:58.199580 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.199549 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fwdhb" event={"ID":"f60428bf-3e0e-4720-9fca-76545b47e8e4","Type":"ContainerStarted","Data":"594c454b3a67e1d29cbf75d4170999f7abd90188b22a5195ced02a21ade4a760"}
Apr 19 12:30:58.201917 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.201900 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:30:58.202223 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.202200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"b6063bc03cc0ec60aa0fb8ee651a6b6fdfcc180809613c1040f6693ddf56b434"}
Apr 19 12:30:58.202303 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.202232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"19085ec6ddcff6f62e95c097911c8193848f4531d4000a438d1a2adeb680caee"}
Apr 19 12:30:58.202303 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.202245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"b4a39054a8621ff2aafd07bf21cc5a24088557c44f1802b0820dc98185224a95"}
Apr 19 12:30:58.202303 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.202258 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"aed8c826bf42e8c529be5eedc615b42af92d540872238b12085cfaf199c15cbd"}
Apr 19 12:30:58.203440 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.203416 2578 generic.go:358] "Generic (PLEG): container finished" podID="59d9fb8b-c8ff-4890-ae51-0f7fa04e6865" containerID="510f14192fb5c0d53bab9b56f1da87f3e10332b8991acae44e6c5085c78c6cc9" exitCode=0
Apr 19 12:30:58.203579 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.203541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerDied","Data":"510f14192fb5c0d53bab9b56f1da87f3e10332b8991acae44e6c5085c78c6cc9"}
Apr 19 12:30:58.216582 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.216539 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fwdhb" podStartSLOduration=3.909876466 podStartE2EDuration="21.216524698s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.513762293 +0000 UTC m=+3.031919884" lastFinishedPulling="2026-04-19 12:30:56.820410519 +0000 UTC m=+20.338568116" observedRunningTime="2026-04-19 12:30:58.215956077 +0000 UTC m=+21.734113689" watchObservedRunningTime="2026-04-19 12:30:58.216524698 +0000 UTC m=+21.734682315"
Apr 19 12:30:58.385832 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:58.385806 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 19 12:30:59.042225 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.042134 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T12:30:58.385828135Z","UUID":"3a25da27-612e-4543-b481-f97ddc2af0df","Handler":null,"Name":"","Endpoint":""}
Apr 19 12:30:59.044752 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.044399 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 19 12:30:59.044752 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.044428 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 19 12:30:59.059203 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.059089 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:30:59.059203 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.059118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:30:59.059362 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:59.059218 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:30:59.059584 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:30:59.059544 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:30:59.206745 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.206717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" event={"ID":"07fe61c2-3d59-48c1-bd4e-911c6609df12","Type":"ContainerStarted","Data":"2f065afbf21cc5e3561e0318c0cfea035b027fb2eed5ceafbb065a3d2f92debe"}
Apr 19 12:30:59.503470 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.503431 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:30:59.504044 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:30:59.504023 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:31:00.059069 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.058806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:31:00.059214 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:00.059099 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:31:00.210321 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.210291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" event={"ID":"07fe61c2-3d59-48c1-bd4e-911c6609df12","Type":"ContainerStarted","Data":"538453d0e2789f1f684c8d00b96409e416faa8a035da5575565fd6c1e3c1cff1"}
Apr 19 12:31:00.213202 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.213124 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:31:00.213517 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.213469 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"6ad9fe84c489ce62879fde958b38767ec0480e6fce837286d7c7f7a97d7f8a64"}
Apr 19 12:31:00.213796 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.213773 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:31:00.214283 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.214268 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q6bkc"
Apr 19 12:31:00.232412 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:00.232373 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wl9zp" podStartSLOduration=3.095487156 podStartE2EDuration="23.232362351s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.525674629 +0000 UTC m=+3.043832235" lastFinishedPulling="2026-04-19 12:30:59.662549838 +0000 UTC m=+23.180707430" observedRunningTime="2026-04-19 12:31:00.231882819 +0000 UTC m=+23.750040433" watchObservedRunningTime="2026-04-19 12:31:00.232362351 +0000 UTC m=+23.750519964"
Apr 19 12:31:01.059404 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:01.059373 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:31:01.059404 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:01.059389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:31:01.059677 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:01.059529 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:31:01.059677 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:01.059650 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:31:01.353392 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:01.353324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:31:01.353907 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:01.353427 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:31:01.353907 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:01.353506 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret podName:635b706c-4824-43f2-9e8f-fed36e897e9b nodeName:}" failed. No retries permitted until 2026-04-19 12:31:17.353472946 +0000 UTC m=+40.871630538 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret") pod "global-pull-secret-syncer-bjv2v" (UID: "635b706c-4824-43f2-9e8f-fed36e897e9b") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:31:02.058685 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:02.058653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:31:02.058822 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:02.058768 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:31:03.059278 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.059122 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:31:03.059809 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.059195 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:31:03.059809 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:03.059377 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b"
Apr 19 12:31:03.059809 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:03.059505 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37"
Apr 19 12:31:03.220767 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.220747 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:31:03.221108 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.221084 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"b07be100d57bcf43ae0459800e96dd37b0032f0cfb783a3cf2fc497559ccc0f5"}
Apr 19 12:31:03.221444 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.221409 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:31:03.221553 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.221452 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:31:03.221553 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.221466 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:31:03.221668 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.221644 2578 scope.go:117] "RemoveContainer" containerID="2f5fbddad0d6c15d1bf21137c392f233b8c2321116d508892c469008166b69dc"
Apr 19 12:31:03.223124 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.223095 2578 generic.go:358] "Generic (PLEG): container finished" podID="59d9fb8b-c8ff-4890-ae51-0f7fa04e6865" containerID="83ff0015c994458fbd5fb810948f054816e097b7479cb8046a7ae9ecee7c11ae" exitCode=0
Apr 19 12:31:03.223222 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.223141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerDied","Data":"83ff0015c994458fbd5fb810948f054816e097b7479cb8046a7ae9ecee7c11ae"}
Apr 19 12:31:03.236605 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.236581 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:31:03.237224 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:03.237209 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:31:04.059601 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.059580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:31:04.059887 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:04.059658 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada"
Apr 19 12:31:04.234634 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.234606 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:31:04.235524 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.235495 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" event={"ID":"dc60d29d-7874-4905-9075-ae159b1131a3","Type":"ContainerStarted","Data":"1b6801194c2cfef2650cd5df2b630f061e40117a35f37e242b9241d5e1cf8b61"}
Apr 19 12:31:04.237625 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.237602 2578 generic.go:358] "Generic (PLEG): container finished" podID="59d9fb8b-c8ff-4890-ae51-0f7fa04e6865" containerID="c1c0084fd4ffe9d0ee4f525cdc8f41fcf777331cb1e324e69c73aa827775ce4f" exitCode=0
Apr 19 12:31:04.237756 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.237641 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerDied","Data":"c1c0084fd4ffe9d0ee4f525cdc8f41fcf777331cb1e324e69c73aa827775ce4f"}
Apr 19 12:31:04.263948 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.263904 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt" podStartSLOduration=9.915336779 podStartE2EDuration="27.263892301s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.523575594 +0000 UTC m=+3.041733202" lastFinishedPulling="2026-04-19 12:30:56.87213113 +0000 UTC m=+20.390288724" observedRunningTime="2026-04-19 12:31:04.263286781 +0000 UTC m=+27.781444386" watchObservedRunningTime="2026-04-19 12:31:04.263892301 +0000 UTC m=+27.782049914"
Apr 19 12:31:04.875184 ip-10-0-129-233
kubenswrapper[2578]: I0419 12:31:04.875154 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pq9ps"] Apr 19 12:31:04.875304 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.875289 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:04.875424 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:04.875402 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:31:04.877698 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.877677 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bjv2v"] Apr 19 12:31:04.877790 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.877781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:31:04.877894 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:04.877874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b" Apr 19 12:31:04.886189 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.886168 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7t9j2"] Apr 19 12:31:04.886289 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:04.886267 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:31:04.886386 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:04.886364 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:31:05.241806 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:05.241575 2578 generic.go:358] "Generic (PLEG): container finished" podID="59d9fb8b-c8ff-4890-ae51-0f7fa04e6865" containerID="5d41defdb33245bb8bf028074a804cce726546d00be498dffa8e62a6dfa25258" exitCode=0 Apr 19 12:31:05.242120 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:05.241651 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerDied","Data":"5d41defdb33245bb8bf028074a804cce726546d00be498dffa8e62a6dfa25258"} Apr 19 12:31:06.059430 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:06.059400 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:06.059576 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:06.059550 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:31:07.060227 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:07.060192 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:31:07.060762 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:07.060299 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b" Apr 19 12:31:07.060762 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:07.060344 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:31:07.060762 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:07.060455 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:31:08.059455 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:08.059424 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:08.059663 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:08.059545 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq9ps" podUID="6189177e-28b2-4186-81ad-531fed1d1ada" Apr 19 12:31:09.058743 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.058711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:31:09.058743 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.058725 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:31:09.059414 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:09.058850 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:31:09.059414 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:09.058976 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bjv2v" podUID="635b706c-4824-43f2-9e8f-fed36e897e9b" Apr 19 12:31:09.848242 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.848170 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-233.ec2.internal" event="NodeReady" Apr 19 12:31:09.848428 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.848292 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 19 12:31:09.889203 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.889170 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wslbj"] Apr 19 12:31:09.893557 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.893515 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r258l"] Apr 19 12:31:09.893714 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.893690 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:09.895924 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.895905 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62vx7\"" Apr 19 12:31:09.896036 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.895926 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 19 12:31:09.896036 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.895957 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 19 12:31:09.896706 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.896680 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:09.898649 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.898631 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 19 12:31:09.898649 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.898660 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 19 12:31:09.898809 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.898716 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xbkh5\"" Apr 19 12:31:09.898809 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.898799 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 19 12:31:09.901360 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:09.901342 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wslbj"] Apr 19 12:31:09.904021 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:31:09.904002 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r258l"] Apr 19 12:31:10.016908 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.016876 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxh9\" (UniqueName: \"kubernetes.io/projected/05871dc3-ae4d-416d-b447-072b85515564-kube-api-access-zdxh9\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.017067 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.016942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:10.017067 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.016993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhsc\" (UniqueName: \"kubernetes.io/projected/126af27c-12ce-43aa-a67f-805e0e4b3a5a-kube-api-access-flhsc\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:10.017067 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.017029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.017199 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.017112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/05871dc3-ae4d-416d-b447-072b85515564-config-volume\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.017199 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.017148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05871dc3-ae4d-416d-b447-072b85515564-tmp-dir\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.059165 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.059139 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:10.061751 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.061732 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 12:31:10.061751 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.061741 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4sh8l\"" Apr 19 12:31:10.061938 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.061732 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 12:31:10.118517 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.118517 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118489 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05871dc3-ae4d-416d-b447-072b85515564-config-volume\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.118517 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05871dc3-ae4d-416d-b447-072b85515564-tmp-dir\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.118754 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxh9\" (UniqueName: \"kubernetes.io/projected/05871dc3-ae4d-416d-b447-072b85515564-kube-api-access-zdxh9\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.118754 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:10.118754 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.118604 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:10.118754 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flhsc\" (UniqueName: \"kubernetes.io/projected/126af27c-12ce-43aa-a67f-805e0e4b3a5a-kube-api-access-flhsc\") pod \"ingress-canary-r258l\" (UID: 
\"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:10.118754 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.118686 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:10.618663743 +0000 UTC m=+34.136821335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found Apr 19 12:31:10.119004 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.118967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05871dc3-ae4d-416d-b447-072b85515564-tmp-dir\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.119004 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.118978 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:10.119113 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.119034 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:31:10.619015885 +0000 UTC m=+34.137173475 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found Apr 19 12:31:10.119173 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.119154 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05871dc3-ae4d-416d-b447-072b85515564-config-volume\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.128791 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.128773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxh9\" (UniqueName: \"kubernetes.io/projected/05871dc3-ae4d-416d-b447-072b85515564-kube-api-access-zdxh9\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.128928 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.128910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhsc\" (UniqueName: \"kubernetes.io/projected/126af27c-12ce-43aa-a67f-805e0e4b3a5a-kube-api-access-flhsc\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:10.622155 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.622120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:31:10.622308 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.622168 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:31:10.622308 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.622271 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:10.622308 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.622272 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:10.622402 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.622334 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:31:11.622315562 +0000 UTC m=+35.140473152 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found Apr 19 12:31:10.622402 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.622348 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:11.622342502 +0000 UTC m=+35.140500094 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found Apr 19 12:31:10.722947 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.722916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:31:10.723122 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.723085 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:31:10.723185 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:10.723162 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:42.723142126 +0000 UTC m=+66.241299717 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:31:10.823586 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.823555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:10.826084 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.826057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67mbw\" (UniqueName: \"kubernetes.io/projected/6189177e-28b2-4186-81ad-531fed1d1ada-kube-api-access-67mbw\") pod \"network-check-target-pq9ps\" (UID: \"6189177e-28b2-4186-81ad-531fed1d1ada\") " pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:10.968598 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:10.968571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:31:11.059726 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.059703 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v" Apr 19 12:31:11.060165 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.059703 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:31:11.062616 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.062595 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 12:31:11.062616 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.062609 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nbhvx\"" Apr 19 12:31:11.063352 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.063336 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 12:31:11.233552 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.233436 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pq9ps"] Apr 19 12:31:11.236905 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:31:11.236876 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6189177e_28b2_4186_81ad_531fed1d1ada.slice/crio-97241b7eb8cddacf7ba404c36870d6a51e81de8aad414b07897e9517bc8575fd WatchSource:0}: Error finding container 97241b7eb8cddacf7ba404c36870d6a51e81de8aad414b07897e9517bc8575fd: Status 404 returned error can't find the container with id 97241b7eb8cddacf7ba404c36870d6a51e81de8aad414b07897e9517bc8575fd Apr 19 12:31:11.254231 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.254189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pq9ps" event={"ID":"6189177e-28b2-4186-81ad-531fed1d1ada","Type":"ContainerStarted","Data":"97241b7eb8cddacf7ba404c36870d6a51e81de8aad414b07897e9517bc8575fd"} Apr 19 12:31:11.629043 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.628977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l"
Apr 19 12:31:11.629043 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:11.629022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj"
Apr 19 12:31:11.629185 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:11.629103 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:11.629185 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:11.629131 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:11.629185 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:11.629168 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:31:13.629152735 +0000 UTC m=+37.147310326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found
Apr 19 12:31:11.629185 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:11.629183 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:13.629177282 +0000 UTC m=+37.147334873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:12.259322 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:12.259291 2578 generic.go:358] "Generic (PLEG): container finished" podID="59d9fb8b-c8ff-4890-ae51-0f7fa04e6865" containerID="a838ff0249a48a4d89914edd455b7360f84c2d20ac0e9fbfd47dc7c6d4814b2d" exitCode=0
Apr 19 12:31:12.259804 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:12.259341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerDied","Data":"a838ff0249a48a4d89914edd455b7360f84c2d20ac0e9fbfd47dc7c6d4814b2d"}
Apr 19 12:31:13.264832 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:13.264792 2578 generic.go:358] "Generic (PLEG): container finished" podID="59d9fb8b-c8ff-4890-ae51-0f7fa04e6865" containerID="5b61ac297c0b96fdd8d4b2e3dce0ab56e7f95012fe26de9ee6e9b6837adeb59f" exitCode=0
Apr 19 12:31:13.265308 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:13.264836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerDied","Data":"5b61ac297c0b96fdd8d4b2e3dce0ab56e7f95012fe26de9ee6e9b6837adeb59f"}
Apr 19 12:31:13.644150 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:13.644058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj"
Apr 19 12:31:13.644279 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:13.644155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l"
Apr 19 12:31:13.644279 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:13.644211 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:13.644279 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:13.644234 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:13.644279 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:13.644278 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:31:17.644261388 +0000 UTC m=+41.162418979 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found
Apr 19 12:31:13.644410 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:13.644294 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:17.644285997 +0000 UTC m=+41.162443588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:14.270458 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:14.270427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" event={"ID":"59d9fb8b-c8ff-4890-ae51-0f7fa04e6865","Type":"ContainerStarted","Data":"179672131b7370eb1d58ff2b2978277922d219a1a5e1e4c5c7ed435f3974af26"}
Apr 19 12:31:14.292595 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:14.292549 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mdpcd" podStartSLOduration=5.711875945 podStartE2EDuration="37.292536994s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:30:39.51864706 +0000 UTC m=+3.036804665" lastFinishedPulling="2026-04-19 12:31:11.099308113 +0000 UTC m=+34.617465714" observedRunningTime="2026-04-19 12:31:14.291095085 +0000 UTC m=+37.809252697" watchObservedRunningTime="2026-04-19 12:31:14.292536994 +0000 UTC m=+37.810694601"
Apr 19 12:31:16.275383 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:16.275195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pq9ps" event={"ID":"6189177e-28b2-4186-81ad-531fed1d1ada","Type":"ContainerStarted","Data":"1337e0b2a9a8eae2058bbf14775e45770140a279c4c939d23937bc1e46caebe0"}
Apr 19 12:31:16.275383 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:16.275391 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pq9ps"
Apr 19 12:31:16.289557 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:16.289472 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pq9ps" podStartSLOduration=35.217621432 podStartE2EDuration="39.289455468s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:31:11.238723487 +0000 UTC m=+34.756881080" lastFinishedPulling="2026-04-19 12:31:15.310557525 +0000 UTC m=+38.828715116" observedRunningTime="2026-04-19 12:31:16.289145774 +0000 UTC m=+39.807303400" watchObservedRunningTime="2026-04-19 12:31:16.289455468 +0000 UTC m=+39.807613084"
Apr 19 12:31:17.372652 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:17.372620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:31:17.376537 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:17.376508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/635b706c-4824-43f2-9e8f-fed36e897e9b-original-pull-secret\") pod \"global-pull-secret-syncer-bjv2v\" (UID: \"635b706c-4824-43f2-9e8f-fed36e897e9b\") " pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:31:17.670072 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:17.670047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bjv2v"
Apr 19 12:31:17.674896 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:17.674873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l"
Apr 19 12:31:17.674996 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:17.674908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj"
Apr 19 12:31:17.674996 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:17.674993 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:17.675072 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:17.675033 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:17.675114 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:17.675041 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:25.67502857 +0000 UTC m=+49.193186161 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:17.675114 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:17.675111 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:31:25.675093729 +0000 UTC m=+49.193251323 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found
Apr 19 12:31:17.780274 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:17.780245 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bjv2v"]
Apr 19 12:31:17.783503 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:31:17.783464 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635b706c_4824_43f2_9e8f_fed36e897e9b.slice/crio-ad4f0a713e65121b0fdf782ce258e116d4511a791af88da08f35933bed48ac77 WatchSource:0}: Error finding container ad4f0a713e65121b0fdf782ce258e116d4511a791af88da08f35933bed48ac77: Status 404 returned error can't find the container with id ad4f0a713e65121b0fdf782ce258e116d4511a791af88da08f35933bed48ac77
Apr 19 12:31:18.280295 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:18.280249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bjv2v" event={"ID":"635b706c-4824-43f2-9e8f-fed36e897e9b","Type":"ContainerStarted","Data":"ad4f0a713e65121b0fdf782ce258e116d4511a791af88da08f35933bed48ac77"}
Apr 19 12:31:20.990252 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.990015 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"]
Apr 19 12:31:20.993404 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.993379 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"]
Apr 19 12:31:20.993581 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.993563 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:20.995902 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.995879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 19 12:31:20.996021 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.995879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 19 12:31:20.996021 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.995882 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 19 12:31:20.996439 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.996419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:20.996871 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.996751 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 19 12:31:20.998414 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.998394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 19 12:31:20.998579 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.998562 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 19 12:31:20.998757 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.998741 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 19 12:31:20.998816 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:20.998795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 19 12:31:21.005818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.005799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"]
Apr 19 12:31:21.006527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.006510 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"]
Apr 19 12:31:21.097795 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-hub\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.097795 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.097989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85fj\" (UniqueName: \"kubernetes.io/projected/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-kube-api-access-r85fj\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.097989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097888 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-ca\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.097989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.097989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-tmp\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.097989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-klusterlet-config\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.097989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.097964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.098241 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.098017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d878\" (UniqueName: \"kubernetes.io/projected/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-kube-api-access-8d878\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199129 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-ca\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199129 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-tmp\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.199318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-klusterlet-config\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.199318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d878\" (UniqueName: \"kubernetes.io/projected/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-kube-api-access-8d878\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-hub\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.199720 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r85fj\" (UniqueName: \"kubernetes.io/projected/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-kube-api-access-r85fj\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.199720 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.199604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-tmp\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.200075 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.200045 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.201885 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.201864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-klusterlet-config\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.202303 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.202282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-ca\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.202387 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.202309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-hub\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.202456 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.202434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.202922 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.202905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.207730 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.207707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r85fj\" (UniqueName: \"kubernetes.io/projected/d1063c17-4535-485c-9bee-d8aa1bc8c1f8-kube-api-access-r85fj\") pod \"klusterlet-addon-workmgr-b9d56b4c9-wggxh\" (UID: \"d1063c17-4535-485c-9bee-d8aa1bc8c1f8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.207844 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.207820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d878\" (UniqueName: \"kubernetes.io/projected/98d6494b-7b6e-4310-ad2d-7f2821a90bc8-kube-api-access-8d878\") pod \"cluster-proxy-proxy-agent-869c6cf87f-txcs7\" (UID: \"98d6494b-7b6e-4310-ad2d-7f2821a90bc8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.308235 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.308086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:21.322907 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.322874 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"
Apr 19 12:31:21.442965 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.442930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"]
Apr 19 12:31:21.446659 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:31:21.446630 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1063c17_4535_485c_9bee_d8aa1bc8c1f8.slice/crio-13e3abf23ec3d946668e21052ed80c671fccd733237b6f25fec139ae75147087 WatchSource:0}: Error finding container 13e3abf23ec3d946668e21052ed80c671fccd733237b6f25fec139ae75147087: Status 404 returned error can't find the container with id 13e3abf23ec3d946668e21052ed80c671fccd733237b6f25fec139ae75147087
Apr 19 12:31:21.459861 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:21.459833 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7"]
Apr 19 12:31:21.464559 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:31:21.464535 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98d6494b_7b6e_4310_ad2d_7f2821a90bc8.slice/crio-18f33e75dd40a3db19322fae39fce779100fbabdcbcaf754f755c913eabc5e41 WatchSource:0}: Error finding container 18f33e75dd40a3db19322fae39fce779100fbabdcbcaf754f755c913eabc5e41: Status 404 returned error can't find the container with id 18f33e75dd40a3db19322fae39fce779100fbabdcbcaf754f755c913eabc5e41
Apr 19 12:31:22.292283 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:22.292239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bjv2v" event={"ID":"635b706c-4824-43f2-9e8f-fed36e897e9b","Type":"ContainerStarted","Data":"4b89744a82c07bf7bead9b5ad37dd95d24abaa77cfff055a652627b269da32ae"}
Apr 19 12:31:22.294101 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:22.294070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" event={"ID":"d1063c17-4535-485c-9bee-d8aa1bc8c1f8","Type":"ContainerStarted","Data":"13e3abf23ec3d946668e21052ed80c671fccd733237b6f25fec139ae75147087"}
Apr 19 12:31:22.296681 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:22.296633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" event={"ID":"98d6494b-7b6e-4310-ad2d-7f2821a90bc8","Type":"ContainerStarted","Data":"18f33e75dd40a3db19322fae39fce779100fbabdcbcaf754f755c913eabc5e41"}
Apr 19 12:31:22.308394 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:22.308346 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bjv2v" podStartSLOduration=33.822820938 podStartE2EDuration="37.308331175s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:31:17.785241291 +0000 UTC m=+41.303398883" lastFinishedPulling="2026-04-19 12:31:21.270751517 +0000 UTC m=+44.788909120" observedRunningTime="2026-04-19 12:31:22.307374476 +0000 UTC m=+45.825532091" watchObservedRunningTime="2026-04-19 12:31:22.308331175 +0000 UTC m=+45.826488791"
Apr 19 12:31:25.734523 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:25.734492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj"
Apr 19 12:31:25.734934 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:25.734564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l"
Apr 19 12:31:25.734934 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:25.734636 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:25.734934 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:25.734668 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:25.734934 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:25.734709 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:41.734687887 +0000 UTC m=+65.252845478 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:25.734934 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:25.734724 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:31:41.734717423 +0000 UTC m=+65.252875014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found
Apr 19 12:31:26.311625 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:26.311600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" event={"ID":"98d6494b-7b6e-4310-ad2d-7f2821a90bc8","Type":"ContainerStarted","Data":"310a76c60ab2a7c470c6ac4c336ef0b17c255db60041dcde36af3f3ab739931a"}
Apr 19 12:31:27.314828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:27.314796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" event={"ID":"d1063c17-4535-485c-9bee-d8aa1bc8c1f8","Type":"ContainerStarted","Data":"37d7875854f5651664fe0fda1a18f45e03eb1367259a12680acbe1af56e8d645"}
Apr 19 12:31:27.315256 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:27.315030 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:27.316601 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:27.316582 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh"
Apr 19 12:31:27.331530 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:27.331468 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" podStartSLOduration=2.57392664 podStartE2EDuration="7.331457739s" podCreationTimestamp="2026-04-19 12:31:20 +0000 UTC" firstStartedPulling="2026-04-19 12:31:21.448414509 +0000 UTC m=+44.966572104" lastFinishedPulling="2026-04-19 12:31:26.2059456 +0000 UTC m=+49.724103203" observedRunningTime="2026-04-19 12:31:27.331314361 +0000 UTC m=+50.849471973" watchObservedRunningTime="2026-04-19 12:31:27.331457739 +0000 UTC m=+50.849615355"
Apr 19 12:31:35.252854 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:35.252826 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jrwbt"
Apr 19 12:31:38.342058 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:38.342021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" event={"ID":"98d6494b-7b6e-4310-ad2d-7f2821a90bc8","Type":"ContainerStarted","Data":"b15ee0e1a227cbd19dad06e959fd96e391b9e1c29c5df841081dc6df831e9b45"}
Apr 19 12:31:38.342058 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:38.342059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" event={"ID":"98d6494b-7b6e-4310-ad2d-7f2821a90bc8","Type":"ContainerStarted","Data":"77e13ab2fab3fae41b02882d72f63a07fc3a87edb13ee2aa6c7c755c92cdff3f"}
Apr 19 12:31:38.358385 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:38.358225 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" podStartSLOduration=2.045976586 podStartE2EDuration="18.358212577s" podCreationTimestamp="2026-04-19 12:31:20 +0000 UTC" firstStartedPulling="2026-04-19 12:31:21.466276488 +0000 UTC m=+44.984434082" lastFinishedPulling="2026-04-19 12:31:37.778512472 +0000 UTC m=+61.296670073" observedRunningTime="2026-04-19 12:31:38.357506927 +0000 UTC m=+61.875664537" watchObservedRunningTime="2026-04-19 12:31:38.358212577 +0000 UTC m=+61.876370190"
Apr 19 12:31:41.743868 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:41.743839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj"
Apr 19 12:31:41.744333 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:41.743907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l"
Apr 19 12:31:41.744333 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:41.743991 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:41.744333 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:41.744004 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:41.744333 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:41.744059 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:32:13.744040653 +0000 UTC m=+97.262198245 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found
Apr 19 12:31:41.744333 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:41.744082 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:13.744074037 +0000 UTC m=+97.262231631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:42.751972 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:42.751940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2"
Apr 19 12:31:42.754319 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:42.754298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 12:31:42.762398 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:42.762380 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 19 12:31:42.762460 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:31:42.762438 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed.
No retries permitted until 2026-04-19 12:32:46.762423104 +0000 UTC m=+130.280580696 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : secret "metrics-daemon-secret" not found Apr 19 12:31:47.279925 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:31:47.279891 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pq9ps" Apr 19 12:32:13.758108 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:32:13.757954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:32:13.758108 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:32:13.758002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:32:13.758108 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:32:13.758108 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:32:13.758675 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:32:13.758120 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:32:13.758675 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:32:13.758172 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls 
podName:05871dc3-ae4d-416d-b447-072b85515564 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:17.75815758 +0000 UTC m=+161.276315171 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls") pod "dns-default-wslbj" (UID: "05871dc3-ae4d-416d-b447-072b85515564") : secret "dns-default-metrics-tls" not found Apr 19 12:32:13.758675 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:32:13.758199 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert podName:126af27c-12ce-43aa-a67f-805e0e4b3a5a nodeName:}" failed. No retries permitted until 2026-04-19 12:33:17.758178994 +0000 UTC m=+161.276336591 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert") pod "ingress-canary-r258l" (UID: "126af27c-12ce-43aa-a67f-805e0e4b3a5a") : secret "canary-serving-cert" not found Apr 19 12:32:46.772931 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:32:46.772893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:32:46.773399 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:32:46.773040 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 12:32:46.773399 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:32:46.773112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs podName:46c7636d-9cd5-47c0-afaa-e58b27072e37 nodeName:}" failed. 
No retries permitted until 2026-04-19 12:34:48.773097041 +0000 UTC m=+252.291254631 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs") pod "network-metrics-daemon-7t9j2" (UID: "46c7636d-9cd5-47c0-afaa-e58b27072e37") : secret "metrics-daemon-secret" not found Apr 19 12:32:59.205749 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:32:59.205716 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wpnz9_f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0/dns-node-resolver/0.log" Apr 19 12:32:59.804553 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:32:59.804524 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fqzjh_5511d94c-29bb-45a0-b060-745261d9a2e8/node-ca/0.log" Apr 19 12:33:11.324374 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:11.324317 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" podUID="98d6494b-7b6e-4310-ad2d-7f2821a90bc8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 12:33:12.906164 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:33:12.906128 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wslbj" podUID="05871dc3-ae4d-416d-b447-072b85515564" Apr 19 12:33:12.912265 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:33:12.912247 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r258l" podUID="126af27c-12ce-43aa-a67f-805e0e4b3a5a" Apr 19 12:33:13.557009 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:13.556976 
2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:33:13.557184 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:13.556980 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wslbj" Apr 19 12:33:14.075968 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:33:14.075938 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7t9j2" podUID="46c7636d-9cd5-47c0-afaa-e58b27072e37" Apr 19 12:33:17.775249 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:17.775213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:33:17.775813 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:17.775256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:33:17.777616 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:17.777594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05871dc3-ae4d-416d-b447-072b85515564-metrics-tls\") pod \"dns-default-wslbj\" (UID: \"05871dc3-ae4d-416d-b447-072b85515564\") " pod="openshift-dns/dns-default-wslbj" Apr 19 12:33:17.777720 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:17.777689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/126af27c-12ce-43aa-a67f-805e0e4b3a5a-cert\") pod \"ingress-canary-r258l\" (UID: \"126af27c-12ce-43aa-a67f-805e0e4b3a5a\") " pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:33:18.060506 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.060408 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62vx7\"" Apr 19 12:33:18.060506 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.060412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xbkh5\"" Apr 19 12:33:18.068380 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.068361 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wslbj" Apr 19 12:33:18.068464 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.068447 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r258l" Apr 19 12:33:18.199882 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.199857 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r258l"] Apr 19 12:33:18.203623 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:33:18.203597 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod126af27c_12ce_43aa_a67f_805e0e4b3a5a.slice/crio-e3c0a40133a0818596c3c4144c2b801a4a8def049b3fca83d77aed0657c476d2 WatchSource:0}: Error finding container e3c0a40133a0818596c3c4144c2b801a4a8def049b3fca83d77aed0657c476d2: Status 404 returned error can't find the container with id e3c0a40133a0818596c3c4144c2b801a4a8def049b3fca83d77aed0657c476d2 Apr 19 12:33:18.214030 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.214010 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wslbj"] Apr 19 12:33:18.216450 ip-10-0-129-233 
kubenswrapper[2578]: W0419 12:33:18.216427 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05871dc3_ae4d_416d_b447_072b85515564.slice/crio-a2a97cf8e1b73dc8166ebe22520d30e41b0568b6f9cee9e30a07623275d01ce6 WatchSource:0}: Error finding container a2a97cf8e1b73dc8166ebe22520d30e41b0568b6f9cee9e30a07623275d01ce6: Status 404 returned error can't find the container with id a2a97cf8e1b73dc8166ebe22520d30e41b0568b6f9cee9e30a07623275d01ce6 Apr 19 12:33:18.569496 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.569453 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r258l" event={"ID":"126af27c-12ce-43aa-a67f-805e0e4b3a5a","Type":"ContainerStarted","Data":"e3c0a40133a0818596c3c4144c2b801a4a8def049b3fca83d77aed0657c476d2"} Apr 19 12:33:18.570370 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:18.570346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wslbj" event={"ID":"05871dc3-ae4d-416d-b447-072b85515564","Type":"ContainerStarted","Data":"a2a97cf8e1b73dc8166ebe22520d30e41b0568b6f9cee9e30a07623275d01ce6"} Apr 19 12:33:20.576582 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:20.576546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wslbj" event={"ID":"05871dc3-ae4d-416d-b447-072b85515564","Type":"ContainerStarted","Data":"c660d84a355fc1bf0510638b7ccef849f485a4308c537ec108407dc4e37f2a5f"} Apr 19 12:33:20.576582 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:20.576581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wslbj" event={"ID":"05871dc3-ae4d-416d-b447-072b85515564","Type":"ContainerStarted","Data":"f72b5852ac13149ee61eb3691c806462c20790d80e4e7d93678948fec7e58f72"} Apr 19 12:33:20.577063 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:20.576672 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-wslbj" Apr 19 12:33:20.577782 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:20.577762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r258l" event={"ID":"126af27c-12ce-43aa-a67f-805e0e4b3a5a","Type":"ContainerStarted","Data":"75b3772a8b00d7287c83e11b3c10544dc74d2624d228ded2c50cc784bc5f6bb6"} Apr 19 12:33:20.592122 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:20.592083 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wslbj" podStartSLOduration=129.80414515 podStartE2EDuration="2m11.592070725s" podCreationTimestamp="2026-04-19 12:31:09 +0000 UTC" firstStartedPulling="2026-04-19 12:33:18.218095114 +0000 UTC m=+161.736252709" lastFinishedPulling="2026-04-19 12:33:20.006020496 +0000 UTC m=+163.524178284" observedRunningTime="2026-04-19 12:33:20.591191687 +0000 UTC m=+164.109349300" watchObservedRunningTime="2026-04-19 12:33:20.592070725 +0000 UTC m=+164.110228342" Apr 19 12:33:20.604712 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:20.604676 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r258l" podStartSLOduration=129.799068005 podStartE2EDuration="2m11.604665748s" podCreationTimestamp="2026-04-19 12:31:09 +0000 UTC" firstStartedPulling="2026-04-19 12:33:18.20529871 +0000 UTC m=+161.723456301" lastFinishedPulling="2026-04-19 12:33:20.010896453 +0000 UTC m=+163.529054044" observedRunningTime="2026-04-19 12:33:20.604290732 +0000 UTC m=+164.122448373" watchObservedRunningTime="2026-04-19 12:33:20.604665748 +0000 UTC m=+164.122823360" Apr 19 12:33:21.324683 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.324650 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" podUID="98d6494b-7b6e-4310-ad2d-7f2821a90bc8" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 19 12:33:21.512556 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.512525 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c9phw"] Apr 19 12:33:21.515722 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.515702 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.519109 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.519081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 19 12:33:21.519229 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.519081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8lgzb\"" Apr 19 12:33:21.519229 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.519149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 19 12:33:21.519229 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.519149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 19 12:33:21.519383 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.519149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 19 12:33:21.531432 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.531410 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c9phw"] Apr 19 12:33:21.544853 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.544830 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76856dbc87-mlxnt"] Apr 19 12:33:21.548790 ip-10-0-129-233 kubenswrapper[2578]: 
I0419 12:33:21.548774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.551135 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.551112 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 19 12:33:21.551135 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.551132 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v66mr\"" Apr 19 12:33:21.551306 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.551143 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 19 12:33:21.551306 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.551236 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 19 12:33:21.556559 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.556541 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 19 12:33:21.563410 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.563390 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76856dbc87-mlxnt"] Apr 19 12:33:21.602337 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.602275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8083e678-d295-4963-917f-b040594707dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.602337 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.602305 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58lx8\" (UniqueName: \"kubernetes.io/projected/8083e678-d295-4963-917f-b040594707dd-kube-api-access-58lx8\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.602337 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.602325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8083e678-d295-4963-917f-b040594707dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.602716 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.602343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8083e678-d295-4963-917f-b040594707dd-data-volume\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.602716 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.602443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8083e678-d295-4963-917f-b040594707dd-crio-socket\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.703645 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-trusted-ca\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.703726 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-registry-tls\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.703726 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8083e678-d295-4963-917f-b040594707dd-data-volume\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.703726 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-image-registry-private-configuration\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.703839 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-installation-pull-secrets\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " 
pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.703839 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8083e678-d295-4963-917f-b040594707dd-crio-socket\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.703839 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-ca-trust-extracted\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.703839 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8083e678-d295-4963-917f-b040594707dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.703839 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72qp\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-kube-api-access-t72qp\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.704076 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703846 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8083e678-d295-4963-917f-b040594707dd-crio-socket\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.704076 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-bound-sa-token\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.704076 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8083e678-d295-4963-917f-b040594707dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.704076 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.703998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-registry-certificates\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.704076 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.704004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8083e678-d295-4963-917f-b040594707dd-data-volume\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " 
pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.704076 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.704027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58lx8\" (UniqueName: \"kubernetes.io/projected/8083e678-d295-4963-917f-b040594707dd-kube-api-access-58lx8\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.704657 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.704638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8083e678-d295-4963-917f-b040594707dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.706179 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.706157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8083e678-d295-4963-917f-b040594707dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.720493 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.720452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58lx8\" (UniqueName: \"kubernetes.io/projected/8083e678-d295-4963-917f-b040594707dd-kube-api-access-58lx8\") pod \"insights-runtime-extractor-c9phw\" (UID: \"8083e678-d295-4963-917f-b040594707dd\") " pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.805088 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805064 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-registry-certificates\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805237 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-trusted-ca\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805237 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-registry-tls\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805237 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-image-registry-private-configuration\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805237 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-installation-pull-secrets\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " 
pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805507 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-ca-trust-extracted\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805507 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t72qp\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-kube-api-access-t72qp\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805507 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-bound-sa-token\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.805723 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.805703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-ca-trust-extracted\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.806192 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.806168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-registry-certificates\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.806919 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.806897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-trusted-ca\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.808145 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.808121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-installation-pull-secrets\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.808212 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.808170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-registry-tls\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.808318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.808297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-image-registry-private-configuration\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.815409 
ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.815386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72qp\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-kube-api-access-t72qp\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.815529 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.815509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f2405a0-05f1-4033-8bc1-a53d6643aa6c-bound-sa-token\") pod \"image-registry-76856dbc87-mlxnt\" (UID: \"1f2405a0-05f1-4033-8bc1-a53d6643aa6c\") " pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.825893 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.825873 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c9phw" Apr 19 12:33:21.858338 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.858278 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:21.948008 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.947979 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c9phw"] Apr 19 12:33:21.951116 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:33:21.951086 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8083e678_d295_4963_917f_b040594707dd.slice/crio-3be0ffd045b8fd8f5d33838375f8afe8393ce95459d17fd391dc6898f8d841b2 WatchSource:0}: Error finding container 3be0ffd045b8fd8f5d33838375f8afe8393ce95459d17fd391dc6898f8d841b2: Status 404 returned error can't find the container with id 3be0ffd045b8fd8f5d33838375f8afe8393ce95459d17fd391dc6898f8d841b2 Apr 19 12:33:21.990389 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:21.990348 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76856dbc87-mlxnt"] Apr 19 12:33:21.992513 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:33:21.992450 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2405a0_05f1_4033_8bc1_a53d6643aa6c.slice/crio-29a6b27c012bf5d372b7d8fee8d8449263c2a461f81cd7e7a6fc59849f0aba6f WatchSource:0}: Error finding container 29a6b27c012bf5d372b7d8fee8d8449263c2a461f81cd7e7a6fc59849f0aba6f: Status 404 returned error can't find the container with id 29a6b27c012bf5d372b7d8fee8d8449263c2a461f81cd7e7a6fc59849f0aba6f Apr 19 12:33:22.584552 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:22.584509 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9phw" event={"ID":"8083e678-d295-4963-917f-b040594707dd","Type":"ContainerStarted","Data":"b37b30e1203b17f00ee6c3d2fea25b7b3641bf947cea4e7e43beaf547894325d"} Apr 19 12:33:22.584725 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:33:22.584563 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9phw" event={"ID":"8083e678-d295-4963-917f-b040594707dd","Type":"ContainerStarted","Data":"3be0ffd045b8fd8f5d33838375f8afe8393ce95459d17fd391dc6898f8d841b2"} Apr 19 12:33:22.585819 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:22.585792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" event={"ID":"1f2405a0-05f1-4033-8bc1-a53d6643aa6c","Type":"ContainerStarted","Data":"6a08a55aa07d65c4cc91d7c95eb291ab6ba3a89633dd058ba82007a778c01f96"} Apr 19 12:33:22.585933 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:22.585827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" event={"ID":"1f2405a0-05f1-4033-8bc1-a53d6643aa6c","Type":"ContainerStarted","Data":"29a6b27c012bf5d372b7d8fee8d8449263c2a461f81cd7e7a6fc59849f0aba6f"} Apr 19 12:33:22.585975 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:22.585934 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:22.606088 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:22.606049 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" podStartSLOduration=1.606033888 podStartE2EDuration="1.606033888s" podCreationTimestamp="2026-04-19 12:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:33:22.604713195 +0000 UTC m=+166.122870809" watchObservedRunningTime="2026-04-19 12:33:22.606033888 +0000 UTC m=+166.124191493" Apr 19 12:33:23.591466 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:23.591427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-c9phw" event={"ID":"8083e678-d295-4963-917f-b040594707dd","Type":"ContainerStarted","Data":"7f2bfc12d7a9f2a7d0d631eb0a1c31ec0a31bf8d23e16d6c126e22f0033b6d2b"} Apr 19 12:33:24.595677 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:24.595604 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9phw" event={"ID":"8083e678-d295-4963-917f-b040594707dd","Type":"ContainerStarted","Data":"59121016da8cf479cee50f7105a3efff29d781a7bde60efe0308bffae7aa8032"} Apr 19 12:33:24.613322 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:24.613280 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c9phw" podStartSLOduration=1.3434302759999999 podStartE2EDuration="3.613267674s" podCreationTimestamp="2026-04-19 12:33:21 +0000 UTC" firstStartedPulling="2026-04-19 12:33:22.009282111 +0000 UTC m=+165.527439716" lastFinishedPulling="2026-04-19 12:33:24.27911952 +0000 UTC m=+167.797277114" observedRunningTime="2026-04-19 12:33:24.612166728 +0000 UTC m=+168.130324341" watchObservedRunningTime="2026-04-19 12:33:24.613267674 +0000 UTC m=+168.131425286" Apr 19 12:33:26.602278 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:26.602249 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1063c17-4535-485c-9bee-d8aa1bc8c1f8" containerID="37d7875854f5651664fe0fda1a18f45e03eb1367259a12680acbe1af56e8d645" exitCode=1 Apr 19 12:33:26.602612 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:26.602311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" event={"ID":"d1063c17-4535-485c-9bee-d8aa1bc8c1f8","Type":"ContainerDied","Data":"37d7875854f5651664fe0fda1a18f45e03eb1367259a12680acbe1af56e8d645"} Apr 19 12:33:26.602612 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:26.602604 2578 scope.go:117] "RemoveContainer" 
containerID="37d7875854f5651664fe0fda1a18f45e03eb1367259a12680acbe1af56e8d645" Apr 19 12:33:27.060124 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:27.060093 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:33:27.316043 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:27.315968 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" Apr 19 12:33:27.606308 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:27.606237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" event={"ID":"d1063c17-4535-485c-9bee-d8aa1bc8c1f8","Type":"ContainerStarted","Data":"160871fb645226ef37e2d3dad3b63c44aaa3991a7f500dac861bccd52b8f61ec"} Apr 19 12:33:27.606660 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:27.606424 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" Apr 19 12:33:27.607031 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:27.607000 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b9d56b4c9-wggxh" Apr 19 12:33:30.581885 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:30.581857 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wslbj" Apr 19 12:33:31.324501 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:31.324448 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" podUID="98d6494b-7b6e-4310-ad2d-7f2821a90bc8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 12:33:31.324658 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:33:31.324542 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" Apr 19 12:33:31.324982 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:31.324952 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"b15ee0e1a227cbd19dad06e959fd96e391b9e1c29c5df841081dc6df831e9b45"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 19 12:33:31.325018 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:31.325003 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" podUID="98d6494b-7b6e-4310-ad2d-7f2821a90bc8" containerName="service-proxy" containerID="cri-o://b15ee0e1a227cbd19dad06e959fd96e391b9e1c29c5df841081dc6df831e9b45" gracePeriod=30 Apr 19 12:33:31.617816 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:31.617748 2578 generic.go:358] "Generic (PLEG): container finished" podID="98d6494b-7b6e-4310-ad2d-7f2821a90bc8" containerID="b15ee0e1a227cbd19dad06e959fd96e391b9e1c29c5df841081dc6df831e9b45" exitCode=2 Apr 19 12:33:31.617816 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:31.617801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" event={"ID":"98d6494b-7b6e-4310-ad2d-7f2821a90bc8","Type":"ContainerDied","Data":"b15ee0e1a227cbd19dad06e959fd96e391b9e1c29c5df841081dc6df831e9b45"} Apr 19 12:33:31.618147 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:31.617827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-869c6cf87f-txcs7" 
event={"ID":"98d6494b-7b6e-4310-ad2d-7f2821a90bc8","Type":"ContainerStarted","Data":"c7e93eae40918b8411feffcaaf87e58e89eb3eff24514474debfe2455139e823"} Apr 19 12:33:41.597610 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.597572 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zn87z"] Apr 19 12:33:41.602645 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.602623 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.604918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.604893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 19 12:33:41.604918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.604903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 19 12:33:41.605122 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.605015 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 19 12:33:41.605122 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.605082 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 19 12:33:41.605260 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.605243 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rhpt4\"" Apr 19 12:33:41.605929 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.605914 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 19 12:33:41.605986 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.605932 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 19 12:33:41.638958 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.638934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639055 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.638969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-sys\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639055 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.638994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-wtmp\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639128 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.639066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/250f49a4-ff50-4180-85c9-c0a23c798518-metrics-client-ca\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639128 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.639096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-root\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639128 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.639121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-tls\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639254 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.639142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-accelerators-collector-config\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639254 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.639216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-textfile\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.639254 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.639247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhjd\" (UniqueName: \"kubernetes.io/projected/250f49a4-ff50-4180-85c9-c0a23c798518-kube-api-access-klhjd\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.739891 
ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.739862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740001 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.739902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-sys\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740001 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.739919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-wtmp\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740001 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.739986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-sys\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-wtmp\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 
12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/250f49a4-ff50-4180-85c9-c0a23c798518-metrics-client-ca\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-root\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-tls\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-accelerators-collector-config\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/250f49a4-ff50-4180-85c9-c0a23c798518-root\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " 
pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-textfile\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740175 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klhjd\" (UniqueName: \"kubernetes.io/projected/250f49a4-ff50-4180-85c9-c0a23c798518-kube-api-access-klhjd\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740513 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:33:41.740192 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 19 12:33:41.740513 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:33:41.740249 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-tls podName:250f49a4-ff50-4180-85c9-c0a23c798518 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:42.240229587 +0000 UTC m=+185.758387180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-tls") pod "node-exporter-zn87z" (UID: "250f49a4-ff50-4180-85c9-c0a23c798518") : secret "node-exporter-tls" not found Apr 19 12:33:41.740622 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740601 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-textfile\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740658 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/250f49a4-ff50-4180-85c9-c0a23c798518-metrics-client-ca\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.740658 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.740651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-accelerators-collector-config\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.742302 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:41.742281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:41.748639 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:33:41.748620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhjd\" (UniqueName: \"kubernetes.io/projected/250f49a4-ff50-4180-85c9-c0a23c798518-kube-api-access-klhjd\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:42.244161 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:42.244132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-tls\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:42.246469 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:42.246451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/250f49a4-ff50-4180-85c9-c0a23c798518-node-exporter-tls\") pod \"node-exporter-zn87z\" (UID: \"250f49a4-ff50-4180-85c9-c0a23c798518\") " pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:42.511716 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:42.511647 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zn87z" Apr 19 12:33:42.519689 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:33:42.519658 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250f49a4_ff50_4180_85c9_c0a23c798518.slice/crio-e282df8aa261b201ab9e2613f2ab036bb4a44484556a9a9ed5c9a7a74bbb786e WatchSource:0}: Error finding container e282df8aa261b201ab9e2613f2ab036bb4a44484556a9a9ed5c9a7a74bbb786e: Status 404 returned error can't find the container with id e282df8aa261b201ab9e2613f2ab036bb4a44484556a9a9ed5c9a7a74bbb786e Apr 19 12:33:42.646776 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:42.646750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zn87z" event={"ID":"250f49a4-ff50-4180-85c9-c0a23c798518","Type":"ContainerStarted","Data":"e282df8aa261b201ab9e2613f2ab036bb4a44484556a9a9ed5c9a7a74bbb786e"} Apr 19 12:33:43.595913 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:43.595889 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76856dbc87-mlxnt" Apr 19 12:33:43.650986 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:43.650955 2578 generic.go:358] "Generic (PLEG): container finished" podID="250f49a4-ff50-4180-85c9-c0a23c798518" containerID="e48d8cd6c87e9de2f0c1afb83713e0fd83f3535165b7ac7d5968dda93782c204" exitCode=0 Apr 19 12:33:43.651374 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:43.651038 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zn87z" event={"ID":"250f49a4-ff50-4180-85c9-c0a23c798518","Type":"ContainerDied","Data":"e48d8cd6c87e9de2f0c1afb83713e0fd83f3535165b7ac7d5968dda93782c204"} Apr 19 12:33:44.657663 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:44.657624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zn87z" 
event={"ID":"250f49a4-ff50-4180-85c9-c0a23c798518","Type":"ContainerStarted","Data":"f10aefb21b4bd91008d8f589ded98aae9f60d454f74bbdf47b41bcbf5faa5bce"} Apr 19 12:33:44.658019 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:44.657669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zn87z" event={"ID":"250f49a4-ff50-4180-85c9-c0a23c798518","Type":"ContainerStarted","Data":"8d152aa42ab008fdc10ff37484762356346cb37eb23df33b43e3dcf1a47a78fe"} Apr 19 12:33:44.675108 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:44.675044 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zn87z" podStartSLOduration=2.728756166 podStartE2EDuration="3.675031658s" podCreationTimestamp="2026-04-19 12:33:41 +0000 UTC" firstStartedPulling="2026-04-19 12:33:42.521467883 +0000 UTC m=+186.039625478" lastFinishedPulling="2026-04-19 12:33:43.467743371 +0000 UTC m=+186.985900970" observedRunningTime="2026-04-19 12:33:44.674014898 +0000 UTC m=+188.192172510" watchObservedRunningTime="2026-04-19 12:33:44.675031658 +0000 UTC m=+188.193189249" Apr 19 12:33:46.158985 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.158952 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6754c8fdfb-2h4xs"] Apr 19 12:33:46.161990 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.161974 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.164231 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.164201 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 19 12:33:46.164371 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.164260 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 19 12:33:46.164371 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.164294 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 19 12:33:46.164531 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.164515 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-8zfkz\"" Apr 19 12:33:46.165179 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.165163 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-761ihplba9aoj\"" Apr 19 12:33:46.165269 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.165253 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 19 12:33:46.170373 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.170355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6754c8fdfb-2h4xs"] Apr 19 12:33:46.173838 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.173818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ac349-1422-440d-a10c-599af83ba311-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: 
\"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.173906 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.173859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-client-ca-bundle\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.173954 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.173914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/873ac349-1422-440d-a10c-599af83ba311-metrics-server-audit-profiles\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.173954 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.173939 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-secret-metrics-server-tls\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.174017 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.173972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-secret-metrics-server-client-certs\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.174017 
ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.173992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcnsg\" (UniqueName: \"kubernetes.io/projected/873ac349-1422-440d-a10c-599af83ba311-kube-api-access-gcnsg\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.174017 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.174008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/873ac349-1422-440d-a10c-599af83ba311-audit-log\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274249 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ac349-1422-440d-a10c-599af83ba311-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274363 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-client-ca-bundle\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274363 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/873ac349-1422-440d-a10c-599af83ba311-metrics-server-audit-profiles\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274363 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-secret-metrics-server-tls\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274515 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274376 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-secret-metrics-server-client-certs\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274515 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcnsg\" (UniqueName: \"kubernetes.io/projected/873ac349-1422-440d-a10c-599af83ba311-kube-api-access-gcnsg\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274515 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/873ac349-1422-440d-a10c-599af83ba311-audit-log\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " 
pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.274989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.274966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873ac349-1422-440d-a10c-599af83ba311-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.275097 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.275077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/873ac349-1422-440d-a10c-599af83ba311-audit-log\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.275863 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.275841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/873ac349-1422-440d-a10c-599af83ba311-metrics-server-audit-profiles\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.277167 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.277139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-secret-metrics-server-tls\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.277167 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.277161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-client-ca-bundle\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.277265 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.277196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/873ac349-1422-440d-a10c-599af83ba311-secret-metrics-server-client-certs\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.281407 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.281391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcnsg\" (UniqueName: \"kubernetes.io/projected/873ac349-1422-440d-a10c-599af83ba311-kube-api-access-gcnsg\") pod \"metrics-server-6754c8fdfb-2h4xs\" (UID: \"873ac349-1422-440d-a10c-599af83ba311\") " pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.471124 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.471104 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:33:46.592851 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.592825 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6754c8fdfb-2h4xs"] Apr 19 12:33:46.596142 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:33:46.596099 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod873ac349_1422_440d_a10c_599af83ba311.slice/crio-d3aa818406361e3e0f67f48a086e292d646c7f5d27846f2143554e2d7da8a6c0 WatchSource:0}: Error finding container d3aa818406361e3e0f67f48a086e292d646c7f5d27846f2143554e2d7da8a6c0: Status 404 returned error can't find the container with id d3aa818406361e3e0f67f48a086e292d646c7f5d27846f2143554e2d7da8a6c0 Apr 19 12:33:46.663713 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:46.663685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" event={"ID":"873ac349-1422-440d-a10c-599af83ba311","Type":"ContainerStarted","Data":"d3aa818406361e3e0f67f48a086e292d646c7f5d27846f2143554e2d7da8a6c0"} Apr 19 12:33:48.670550 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:48.670514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" event={"ID":"873ac349-1422-440d-a10c-599af83ba311","Type":"ContainerStarted","Data":"185443aae8de49f705758956027657bf13db1aeddb2ec80408b286718d2322f2"} Apr 19 12:33:48.686722 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:33:48.686680 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" podStartSLOduration=1.200249683 podStartE2EDuration="2.686667885s" podCreationTimestamp="2026-04-19 12:33:46 +0000 UTC" firstStartedPulling="2026-04-19 12:33:46.598198326 +0000 UTC m=+190.116355917" lastFinishedPulling="2026-04-19 12:33:48.084616518 
+0000 UTC m=+191.602774119" observedRunningTime="2026-04-19 12:33:48.685518455 +0000 UTC m=+192.203676065" watchObservedRunningTime="2026-04-19 12:33:48.686667885 +0000 UTC m=+192.204825546" Apr 19 12:34:06.472053 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:06.472016 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:34:06.472053 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:06.472057 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:34:26.477574 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:26.477533 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:34:26.481490 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:26.481455 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6754c8fdfb-2h4xs" Apr 19 12:34:48.802688 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:48.802649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:34:48.805110 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:48.805085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46c7636d-9cd5-47c0-afaa-e58b27072e37-metrics-certs\") pod \"network-metrics-daemon-7t9j2\" (UID: \"46c7636d-9cd5-47c0-afaa-e58b27072e37\") " pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:34:48.963210 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:48.963183 2578 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nbhvx\"" Apr 19 12:34:48.971063 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:48.971047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t9j2" Apr 19 12:34:49.091532 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:49.091452 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7t9j2"] Apr 19 12:34:49.094726 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:34:49.094691 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46c7636d_9cd5_47c0_afaa_e58b27072e37.slice/crio-3fbf1899bf3f0997a44394adad3cfcfe722accb1529b5427823fa3b1bc583011 WatchSource:0}: Error finding container 3fbf1899bf3f0997a44394adad3cfcfe722accb1529b5427823fa3b1bc583011: Status 404 returned error can't find the container with id 3fbf1899bf3f0997a44394adad3cfcfe722accb1529b5427823fa3b1bc583011 Apr 19 12:34:49.826717 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:49.826676 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7t9j2" event={"ID":"46c7636d-9cd5-47c0-afaa-e58b27072e37","Type":"ContainerStarted","Data":"3fbf1899bf3f0997a44394adad3cfcfe722accb1529b5427823fa3b1bc583011"} Apr 19 12:34:50.831547 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:50.831512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7t9j2" event={"ID":"46c7636d-9cd5-47c0-afaa-e58b27072e37","Type":"ContainerStarted","Data":"619b523797e7a49c148994d9cb9485fd455d9eac5bd975283593adc0191f230e"} Apr 19 12:34:50.831547 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:50.831545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7t9j2" 
event={"ID":"46c7636d-9cd5-47c0-afaa-e58b27072e37","Type":"ContainerStarted","Data":"81d87471fd57270003c3609d17076752371b257c32bc77b70f12b5834a4e4300"} Apr 19 12:34:50.846077 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:34:50.846033 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7t9j2" podStartSLOduration=252.927341505 podStartE2EDuration="4m13.846019894s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:34:49.09677816 +0000 UTC m=+252.614935752" lastFinishedPulling="2026-04-19 12:34:50.015456548 +0000 UTC m=+253.533614141" observedRunningTime="2026-04-19 12:34:50.845467837 +0000 UTC m=+254.363625451" watchObservedRunningTime="2026-04-19 12:34:50.846019894 +0000 UTC m=+254.364177508" Apr 19 12:35:36.971705 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:35:36.971680 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:35:36.972215 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:35:36.971718 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:35:36.977322 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:35:36.977301 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 12:38:03.047410 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.047336 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-ljkqx"] Apr 19 12:38:03.050344 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.050327 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.052542 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.052521 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 19 12:38:03.053371 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.053353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-5fnfg\"" Apr 19 12:38:03.053420 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.053379 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 19 12:38:03.058520 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.058495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-ljkqx"] Apr 19 12:38:03.167143 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.167114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36f68a4b-9bad-4638-897d-1244c4eef64d-bound-sa-token\") pod \"cert-manager-759f64656b-ljkqx\" (UID: \"36f68a4b-9bad-4638-897d-1244c4eef64d\") " pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.167253 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.167154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspxr\" (UniqueName: \"kubernetes.io/projected/36f68a4b-9bad-4638-897d-1244c4eef64d-kube-api-access-gspxr\") pod \"cert-manager-759f64656b-ljkqx\" (UID: \"36f68a4b-9bad-4638-897d-1244c4eef64d\") " pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.268502 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.268453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gspxr\" (UniqueName: 
\"kubernetes.io/projected/36f68a4b-9bad-4638-897d-1244c4eef64d-kube-api-access-gspxr\") pod \"cert-manager-759f64656b-ljkqx\" (UID: \"36f68a4b-9bad-4638-897d-1244c4eef64d\") " pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.268655 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.268557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36f68a4b-9bad-4638-897d-1244c4eef64d-bound-sa-token\") pod \"cert-manager-759f64656b-ljkqx\" (UID: \"36f68a4b-9bad-4638-897d-1244c4eef64d\") " pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.277364 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.277333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36f68a4b-9bad-4638-897d-1244c4eef64d-bound-sa-token\") pod \"cert-manager-759f64656b-ljkqx\" (UID: \"36f68a4b-9bad-4638-897d-1244c4eef64d\") " pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.277609 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.277591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspxr\" (UniqueName: \"kubernetes.io/projected/36f68a4b-9bad-4638-897d-1244c4eef64d-kube-api-access-gspxr\") pod \"cert-manager-759f64656b-ljkqx\" (UID: \"36f68a4b-9bad-4638-897d-1244c4eef64d\") " pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.360009 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.359956 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-ljkqx" Apr 19 12:38:03.472207 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.472181 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-ljkqx"] Apr 19 12:38:03.474695 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:38:03.474666 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36f68a4b_9bad_4638_897d_1244c4eef64d.slice/crio-7c0d52da54c8e006d1c6b65d292085f4e5b7b76e04edc1aa65c1b37674879aff WatchSource:0}: Error finding container 7c0d52da54c8e006d1c6b65d292085f4e5b7b76e04edc1aa65c1b37674879aff: Status 404 returned error can't find the container with id 7c0d52da54c8e006d1c6b65d292085f4e5b7b76e04edc1aa65c1b37674879aff Apr 19 12:38:03.476465 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:03.476449 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:38:04.303422 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:04.303376 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-ljkqx" event={"ID":"36f68a4b-9bad-4638-897d-1244c4eef64d","Type":"ContainerStarted","Data":"7c0d52da54c8e006d1c6b65d292085f4e5b7b76e04edc1aa65c1b37674879aff"} Apr 19 12:38:07.312436 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:07.312392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-ljkqx" event={"ID":"36f68a4b-9bad-4638-897d-1244c4eef64d","Type":"ContainerStarted","Data":"86e0785f9b66d74217d41e1bd8292362127e0c2af1c86ea91e401632ac420dc7"} Apr 19 12:38:07.327254 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:07.327209 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-ljkqx" podStartSLOduration=1.269329544 podStartE2EDuration="4.327196397s" podCreationTimestamp="2026-04-19 12:38:03 +0000 
UTC" firstStartedPulling="2026-04-19 12:38:03.476601025 +0000 UTC m=+446.994758616" lastFinishedPulling="2026-04-19 12:38:06.534467875 +0000 UTC m=+450.052625469" observedRunningTime="2026-04-19 12:38:07.325869327 +0000 UTC m=+450.844026942" watchObservedRunningTime="2026-04-19 12:38:07.327196397 +0000 UTC m=+450.845354011" Apr 19 12:38:14.553347 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.553318 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d"] Apr 19 12:38:14.556422 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.556405 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.560195 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.560179 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 12:38:14.560591 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.560577 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 12:38:14.560884 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.560868 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-scmp2\"" Apr 19 12:38:14.561090 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.561074 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 12:38:14.561203 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.561074 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 12:38:14.587207 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.587185 2578 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d"] Apr 19 12:38:14.750889 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.750821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-596q8\" (UniqueName: \"kubernetes.io/projected/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-kube-api-access-596q8\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.750889 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.750870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.751035 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.750927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.852125 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.852047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-596q8\" (UniqueName: \"kubernetes.io/projected/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-kube-api-access-596q8\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " 
pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.852210 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.852175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.852258 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.852221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.854754 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.854733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.854842 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.854786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.861066 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.861046 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-596q8\" (UniqueName: \"kubernetes.io/projected/ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a-kube-api-access-596q8\") pod \"opendatahub-operator-controller-manager-9ff869b6b-c4c6d\" (UID: \"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.865839 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.865822 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:14.987580 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:14.987550 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d"] Apr 19 12:38:14.990464 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:38:14.990436 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea7e6b0e_1e17_4724_8d3b_6847c1c98f5a.slice/crio-e859000c4e9f530b1a29033e1d88c1936461c5d3f7c1f37ebbed43232294e1f8 WatchSource:0}: Error finding container e859000c4e9f530b1a29033e1d88c1936461c5d3f7c1f37ebbed43232294e1f8: Status 404 returned error can't find the container with id e859000c4e9f530b1a29033e1d88c1936461c5d3f7c1f37ebbed43232294e1f8 Apr 19 12:38:15.333086 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:15.333054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" event={"ID":"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a","Type":"ContainerStarted","Data":"e859000c4e9f530b1a29033e1d88c1936461c5d3f7c1f37ebbed43232294e1f8"} Apr 19 12:38:18.342649 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:18.342616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" 
event={"ID":"ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a","Type":"ContainerStarted","Data":"f669de70a3e62708989cd032a38ec3d5c282630bb250b67bc928d71e9ed872a6"} Apr 19 12:38:18.343029 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:18.342772 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:18.363090 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:18.363042 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" podStartSLOduration=1.834518759 podStartE2EDuration="4.363030139s" podCreationTimestamp="2026-04-19 12:38:14 +0000 UTC" firstStartedPulling="2026-04-19 12:38:14.992439383 +0000 UTC m=+458.510597002" lastFinishedPulling="2026-04-19 12:38:17.520950779 +0000 UTC m=+461.039108382" observedRunningTime="2026-04-19 12:38:18.361406335 +0000 UTC m=+461.879563945" watchObservedRunningTime="2026-04-19 12:38:18.363030139 +0000 UTC m=+461.881187751" Apr 19 12:38:29.347715 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:29.347685 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-c4c6d" Apr 19 12:38:35.178246 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.178215 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-2j694"] Apr 19 12:38:35.240890 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.240860 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-2j694"] Apr 19 12:38:35.241032 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.240968 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.243274 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.243250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 19 12:38:35.243445 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.243428 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-gvp4r\"" Apr 19 12:38:35.281818 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.281794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zst\" (UniqueName: \"kubernetes.io/projected/e46ceb33-f1df-4349-8d92-6c1636f2fb98-kube-api-access-c8zst\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.281909 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.281838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.382525 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.382494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zst\" (UniqueName: \"kubernetes.io/projected/e46ceb33-f1df-4349-8d92-6c1636f2fb98-kube-api-access-c8zst\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.382631 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.382552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.382672 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:38:35.382659 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 19 12:38:35.382716 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:38:35.382704 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert podName:e46ceb33-f1df-4349-8d92-6c1636f2fb98 nodeName:}" failed. No retries permitted until 2026-04-19 12:38:35.882685294 +0000 UTC m=+479.400842884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert") pod "odh-model-controller-858dbf95b8-2j694" (UID: "e46ceb33-f1df-4349-8d92-6c1636f2fb98") : secret "odh-model-controller-webhook-cert" not found Apr 19 12:38:35.392751 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.392725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zst\" (UniqueName: \"kubernetes.io/projected/e46ceb33-f1df-4349-8d92-6c1636f2fb98-kube-api-access-c8zst\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.887190 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:35.887153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:35.887355 
ip-10-0-129-233 kubenswrapper[2578]: E0419 12:38:35.887275 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 19 12:38:35.887355 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:38:35.887325 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert podName:e46ceb33-f1df-4349-8d92-6c1636f2fb98 nodeName:}" failed. No retries permitted until 2026-04-19 12:38:36.887311878 +0000 UTC m=+480.405469469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert") pod "odh-model-controller-858dbf95b8-2j694" (UID: "e46ceb33-f1df-4349-8d92-6c1636f2fb98") : secret "odh-model-controller-webhook-cert" not found Apr 19 12:38:36.896171 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:36.896140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:36.898601 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:36.898582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e46ceb33-f1df-4349-8d92-6c1636f2fb98-cert\") pod \"odh-model-controller-858dbf95b8-2j694\" (UID: \"e46ceb33-f1df-4349-8d92-6c1636f2fb98\") " pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:37.053449 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:37.053424 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-gvp4r\"" Apr 19 12:38:37.061335 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:37.061310 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:37.180732 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:37.180698 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-2j694"] Apr 19 12:38:37.183506 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:38:37.183462 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46ceb33_f1df_4349_8d92_6c1636f2fb98.slice/crio-a987b5bd64c40adfb3324e3a6cb87bc0a3bbd6ee88bb9da39aa5e005e4ab47ac WatchSource:0}: Error finding container a987b5bd64c40adfb3324e3a6cb87bc0a3bbd6ee88bb9da39aa5e005e4ab47ac: Status 404 returned error can't find the container with id a987b5bd64c40adfb3324e3a6cb87bc0a3bbd6ee88bb9da39aa5e005e4ab47ac Apr 19 12:38:37.390359 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:37.390322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" event={"ID":"e46ceb33-f1df-4349-8d92-6c1636f2fb98","Type":"ContainerStarted","Data":"a987b5bd64c40adfb3324e3a6cb87bc0a3bbd6ee88bb9da39aa5e005e4ab47ac"} Apr 19 12:38:40.400001 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.399966 2578 generic.go:358] "Generic (PLEG): container finished" podID="e46ceb33-f1df-4349-8d92-6c1636f2fb98" containerID="29d2e9cadfe76cef9de8e77ec9441eb5edebd6a3f5147ef6545de31be438e807" exitCode=1 Apr 19 12:38:40.400381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.400016 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" event={"ID":"e46ceb33-f1df-4349-8d92-6c1636f2fb98","Type":"ContainerDied","Data":"29d2e9cadfe76cef9de8e77ec9441eb5edebd6a3f5147ef6545de31be438e807"} Apr 19 12:38:40.400381 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.400221 2578 scope.go:117] "RemoveContainer" 
containerID="29d2e9cadfe76cef9de8e77ec9441eb5edebd6a3f5147ef6545de31be438e807" Apr 19 12:38:40.590777 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.590747 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7j52c"] Apr 19 12:38:40.593801 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.593784 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:40.595866 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.595845 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 19 12:38:40.595983 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.595909 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-fw474\"" Apr 19 12:38:40.602040 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.602015 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7j52c"] Apr 19 12:38:40.624640 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.624617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnct\" (UniqueName: \"kubernetes.io/projected/3792cd68-3123-4e65-bfd6-a57c6528d028-kube-api-access-bxnct\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:40.624705 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.624692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3792cd68-3123-4e65-bfd6-a57c6528d028-cert\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:40.725472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.725441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3792cd68-3123-4e65-bfd6-a57c6528d028-cert\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:40.725613 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.725523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnct\" (UniqueName: \"kubernetes.io/projected/3792cd68-3123-4e65-bfd6-a57c6528d028-kube-api-access-bxnct\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:40.725613 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:38:40.725578 2578 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 19 12:38:40.725681 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:38:40.725635 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3792cd68-3123-4e65-bfd6-a57c6528d028-cert podName:3792cd68-3123-4e65-bfd6-a57c6528d028 nodeName:}" failed. No retries permitted until 2026-04-19 12:38:41.225620077 +0000 UTC m=+484.743777667 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3792cd68-3123-4e65-bfd6-a57c6528d028-cert") pod "kserve-controller-manager-856948b99f-7j52c" (UID: "3792cd68-3123-4e65-bfd6-a57c6528d028") : secret "kserve-webhook-server-cert" not found Apr 19 12:38:40.735077 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:40.735059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnct\" (UniqueName: \"kubernetes.io/projected/3792cd68-3123-4e65-bfd6-a57c6528d028-kube-api-access-bxnct\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:41.229699 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.229667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3792cd68-3123-4e65-bfd6-a57c6528d028-cert\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:41.232144 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.232117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3792cd68-3123-4e65-bfd6-a57c6528d028-cert\") pod \"kserve-controller-manager-856948b99f-7j52c\" (UID: \"3792cd68-3123-4e65-bfd6-a57c6528d028\") " pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:41.404491 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.404433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" event={"ID":"e46ceb33-f1df-4349-8d92-6c1636f2fb98","Type":"ContainerStarted","Data":"7412a33a94dd36641f93441c5aea7e837e753a2e3b0cef0a41749e18beea3bc5"} Apr 19 12:38:41.404932 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.404551 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:41.430198 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.430148 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" podStartSLOduration=2.98951081 podStartE2EDuration="6.430135279s" podCreationTimestamp="2026-04-19 12:38:35 +0000 UTC" firstStartedPulling="2026-04-19 12:38:37.184938362 +0000 UTC m=+480.703095953" lastFinishedPulling="2026-04-19 12:38:40.625562828 +0000 UTC m=+484.143720422" observedRunningTime="2026-04-19 12:38:41.429882359 +0000 UTC m=+484.948039973" watchObservedRunningTime="2026-04-19 12:38:41.430135279 +0000 UTC m=+484.948292894" Apr 19 12:38:41.508948 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.508880 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:41.630267 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:41.630243 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7j52c"] Apr 19 12:38:41.632456 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:38:41.632430 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3792cd68_3123_4e65_bfd6_a57c6528d028.slice/crio-6374e828cc8a61d6fd62c9a44c67f721af16646a8f97ec7bf90c39a6aea112a5 WatchSource:0}: Error finding container 6374e828cc8a61d6fd62c9a44c67f721af16646a8f97ec7bf90c39a6aea112a5: Status 404 returned error can't find the container with id 6374e828cc8a61d6fd62c9a44c67f721af16646a8f97ec7bf90c39a6aea112a5 Apr 19 12:38:42.409000 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:42.408959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" 
event={"ID":"3792cd68-3123-4e65-bfd6-a57c6528d028","Type":"ContainerStarted","Data":"6374e828cc8a61d6fd62c9a44c67f721af16646a8f97ec7bf90c39a6aea112a5"} Apr 19 12:38:43.029646 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.029611 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq"] Apr 19 12:38:43.032783 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.032762 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.034919 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.034891 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 19 12:38:43.035866 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.035820 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 19 12:38:43.035866 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.035834 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 12:38:43.036024 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.035883 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 12:38:43.036024 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.035882 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-sxwn7\"" Apr 19 12:38:43.042529 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.042508 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq"] Apr 19 12:38:43.143448 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.143421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xm7ht\" (UniqueName: \"kubernetes.io/projected/c1bac5ee-efd8-4476-900d-964769a87ad2-kube-api-access-xm7ht\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.143623 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.143458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1bac5ee-efd8-4476-900d-964769a87ad2-tmp\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.143623 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.143502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1bac5ee-efd8-4476-900d-964769a87ad2-tls-certs\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.244816 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.244789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7ht\" (UniqueName: \"kubernetes.io/projected/c1bac5ee-efd8-4476-900d-964769a87ad2-kube-api-access-xm7ht\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.244973 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.244822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1bac5ee-efd8-4476-900d-964769a87ad2-tmp\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.244973 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:38:43.244841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1bac5ee-efd8-4476-900d-964769a87ad2-tls-certs\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.246948 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.246922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1bac5ee-efd8-4476-900d-964769a87ad2-tmp\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.247131 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.247115 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1bac5ee-efd8-4476-900d-964769a87ad2-tls-certs\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.251720 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.251696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7ht\" (UniqueName: \"kubernetes.io/projected/c1bac5ee-efd8-4476-900d-964769a87ad2-kube-api-access-xm7ht\") pod \"kube-auth-proxy-874cdfcc7-g9xcq\" (UID: \"c1bac5ee-efd8-4476-900d-964769a87ad2\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.345107 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.345046 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" Apr 19 12:38:43.474086 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:43.474042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq"] Apr 19 12:38:43.477609 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:38:43.477576 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1bac5ee_efd8_4476_900d_964769a87ad2.slice/crio-ab73c94d4b9d85173bcbb4b8d06a066bcf2a9ad380c56cee8ce8376111dcd8ac WatchSource:0}: Error finding container ab73c94d4b9d85173bcbb4b8d06a066bcf2a9ad380c56cee8ce8376111dcd8ac: Status 404 returned error can't find the container with id ab73c94d4b9d85173bcbb4b8d06a066bcf2a9ad380c56cee8ce8376111dcd8ac Apr 19 12:38:44.418212 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:44.418177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" event={"ID":"c1bac5ee-efd8-4476-900d-964769a87ad2","Type":"ContainerStarted","Data":"ab73c94d4b9d85173bcbb4b8d06a066bcf2a9ad380c56cee8ce8376111dcd8ac"} Apr 19 12:38:44.420085 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:44.420045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" event={"ID":"3792cd68-3123-4e65-bfd6-a57c6528d028","Type":"ContainerStarted","Data":"f32de5ff8f0c96d9a1def6389736942105e5fb97c48c29a15f216ef27b9399d4"} Apr 19 12:38:44.420429 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:44.420240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:38:44.439362 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:44.439137 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" podStartSLOduration=1.950259338 
podStartE2EDuration="4.439120098s" podCreationTimestamp="2026-04-19 12:38:40 +0000 UTC" firstStartedPulling="2026-04-19 12:38:41.633721392 +0000 UTC m=+485.151878984" lastFinishedPulling="2026-04-19 12:38:44.122582152 +0000 UTC m=+487.640739744" observedRunningTime="2026-04-19 12:38:44.437867728 +0000 UTC m=+487.956025354" watchObservedRunningTime="2026-04-19 12:38:44.439120098 +0000 UTC m=+487.957277712" Apr 19 12:38:47.430868 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:47.430832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" event={"ID":"c1bac5ee-efd8-4476-900d-964769a87ad2","Type":"ContainerStarted","Data":"7b565e3bfb813626d6a0c0261fdcf97819510632c8c6818db4760998b5593d20"} Apr 19 12:38:47.447316 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:47.447266 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-g9xcq" podStartSLOduration=1.363008885 podStartE2EDuration="4.447254502s" podCreationTimestamp="2026-04-19 12:38:43 +0000 UTC" firstStartedPulling="2026-04-19 12:38:43.479724079 +0000 UTC m=+486.997881671" lastFinishedPulling="2026-04-19 12:38:46.56396968 +0000 UTC m=+490.082127288" observedRunningTime="2026-04-19 12:38:47.445130905 +0000 UTC m=+490.963288518" watchObservedRunningTime="2026-04-19 12:38:47.447254502 +0000 UTC m=+490.965412122" Apr 19 12:38:52.411023 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:52.410991 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-2j694" Apr 19 12:38:58.396578 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.396544 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-h79h4"] Apr 19 12:38:58.399755 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.399738 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.403230 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.403207 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 19 12:38:58.403542 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.403524 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 19 12:38:58.403625 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.403534 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-g5cmn\"" Apr 19 12:38:58.412134 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.412112 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-h79h4"] Apr 19 12:38:58.451960 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.451932 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj9wm\" (UniqueName: \"kubernetes.io/projected/4a2b8762-d7f6-4167-ae53-61ec494109b3-kube-api-access-sj9wm\") pod \"servicemesh-operator3-55f49c5f94-h79h4\" (UID: \"4a2b8762-d7f6-4167-ae53-61ec494109b3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.452065 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.451979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4a2b8762-d7f6-4167-ae53-61ec494109b3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-h79h4\" (UID: \"4a2b8762-d7f6-4167-ae53-61ec494109b3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.553003 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.552966 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sj9wm\" (UniqueName: \"kubernetes.io/projected/4a2b8762-d7f6-4167-ae53-61ec494109b3-kube-api-access-sj9wm\") pod \"servicemesh-operator3-55f49c5f94-h79h4\" (UID: \"4a2b8762-d7f6-4167-ae53-61ec494109b3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.553122 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.553029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4a2b8762-d7f6-4167-ae53-61ec494109b3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-h79h4\" (UID: \"4a2b8762-d7f6-4167-ae53-61ec494109b3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.555588 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.555567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4a2b8762-d7f6-4167-ae53-61ec494109b3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-h79h4\" (UID: \"4a2b8762-d7f6-4167-ae53-61ec494109b3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.564951 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.564933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj9wm\" (UniqueName: \"kubernetes.io/projected/4a2b8762-d7f6-4167-ae53-61ec494109b3-kube-api-access-sj9wm\") pod \"servicemesh-operator3-55f49c5f94-h79h4\" (UID: \"4a2b8762-d7f6-4167-ae53-61ec494109b3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.708719 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.708689 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:38:58.838223 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:58.838201 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-h79h4"] Apr 19 12:38:58.841168 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:38:58.841143 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2b8762_d7f6_4167_ae53_61ec494109b3.slice/crio-df6ad8e2274916cecf608dbad7513b5fd07df2118dddf6ebb24aeec8eddf3500 WatchSource:0}: Error finding container df6ad8e2274916cecf608dbad7513b5fd07df2118dddf6ebb24aeec8eddf3500: Status 404 returned error can't find the container with id df6ad8e2274916cecf608dbad7513b5fd07df2118dddf6ebb24aeec8eddf3500 Apr 19 12:38:59.466084 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:38:59.466034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" event={"ID":"4a2b8762-d7f6-4167-ae53-61ec494109b3","Type":"ContainerStarted","Data":"df6ad8e2274916cecf608dbad7513b5fd07df2118dddf6ebb24aeec8eddf3500"} Apr 19 12:39:01.473571 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:01.473535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" event={"ID":"4a2b8762-d7f6-4167-ae53-61ec494109b3","Type":"ContainerStarted","Data":"f600e976d77ffe39e69d49edc5b72a1d061b3c3789dc5fddcf3582356cf7328b"} Apr 19 12:39:01.473956 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:01.473717 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:39:01.493673 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:01.493628 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" podStartSLOduration=1.369852823 podStartE2EDuration="3.493613944s" podCreationTimestamp="2026-04-19 12:38:58 +0000 UTC" firstStartedPulling="2026-04-19 12:38:58.844407292 +0000 UTC m=+502.362564887" lastFinishedPulling="2026-04-19 12:39:00.968168414 +0000 UTC m=+504.486326008" observedRunningTime="2026-04-19 12:39:01.491786681 +0000 UTC m=+505.009944372" watchObservedRunningTime="2026-04-19 12:39:01.493613944 +0000 UTC m=+505.011771558" Apr 19 12:39:12.479055 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:12.479026 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-h79h4" Apr 19 12:39:14.599351 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.599316 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d"] Apr 19 12:39:14.651821 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.651792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d"] Apr 19 12:39:14.651976 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.651922 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.654344 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.654313 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 19 12:39:14.654511 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.654395 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 19 12:39:14.654511 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.654474 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 19 12:39:14.654652 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.654581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-fk5pt\"" Apr 19 12:39:14.654652 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.654611 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 19 12:39:14.770161 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7m8\" (UniqueName: \"kubernetes.io/projected/94ddfebb-3e22-4687-ac70-95168adb6c6c-kube-api-access-zf7m8\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.770161 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/94ddfebb-3e22-4687-ac70-95168adb6c6c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.770320 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.770320 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770232 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.770320 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.770421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.770460 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.770413 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.871718 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.871688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.871869 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.871740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.871941 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.871861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7m8\" (UniqueName: \"kubernetes.io/projected/94ddfebb-3e22-4687-ac70-95168adb6c6c-kube-api-access-zf7m8\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.871941 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.871902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/94ddfebb-3e22-4687-ac70-95168adb6c6c-local-certs\") pod 
\"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.872041 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.871941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.872041 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.871967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.872041 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.872014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.872856 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.872827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.874382 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:39:14.874356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.874514 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.874420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.874580 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.874516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/94ddfebb-3e22-4687-ac70-95168adb6c6c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.874636 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.874593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.879327 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.879302 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7m8\" (UniqueName: \"kubernetes.io/projected/94ddfebb-3e22-4687-ac70-95168adb6c6c-kube-api-access-zf7m8\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: 
\"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.879417 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.879354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/94ddfebb-3e22-4687-ac70-95168adb6c6c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8772d\" (UID: \"94ddfebb-3e22-4687-ac70-95168adb6c6c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:14.961643 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:14.961617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:15.091439 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:15.091412 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d"] Apr 19 12:39:15.095160 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:39:15.095115 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94ddfebb_3e22_4687_ac70_95168adb6c6c.slice/crio-b39d724a5cdacf6262f2646c9d385f8d1bd30db041ece76f584f40b001fb79b7 WatchSource:0}: Error finding container b39d724a5cdacf6262f2646c9d385f8d1bd30db041ece76f584f40b001fb79b7: Status 404 returned error can't find the container with id b39d724a5cdacf6262f2646c9d385f8d1bd30db041ece76f584f40b001fb79b7 Apr 19 12:39:15.430318 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:15.430290 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-7j52c" Apr 19 12:39:15.515569 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:15.515536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" 
event={"ID":"94ddfebb-3e22-4687-ac70-95168adb6c6c","Type":"ContainerStarted","Data":"b39d724a5cdacf6262f2646c9d385f8d1bd30db041ece76f584f40b001fb79b7"} Apr 19 12:39:17.404936 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:17.404897 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 19 12:39:17.405225 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:17.404978 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 19 12:39:17.524047 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:17.524012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" event={"ID":"94ddfebb-3e22-4687-ac70-95168adb6c6c","Type":"ContainerStarted","Data":"08d9562aa146f4ce5f14bcc5302137ef62bcb73c844e7e2b26aeb78cbfcf0c72"} Apr 19 12:39:17.524253 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:17.524132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:39:17.541874 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:17.541819 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" podStartSLOduration=1.234315712 podStartE2EDuration="3.541804638s" podCreationTimestamp="2026-04-19 12:39:14 +0000 UTC" firstStartedPulling="2026-04-19 12:39:15.097187439 +0000 UTC m=+518.615345030" lastFinishedPulling="2026-04-19 12:39:17.404676362 +0000 UTC m=+520.922833956" observedRunningTime="2026-04-19 12:39:17.540612981 +0000 UTC m=+521.058770593" watchObservedRunningTime="2026-04-19 12:39:17.541804638 +0000 UTC m=+521.059962252" Apr 19 12:39:18.530415 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:39:18.530377 2578 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-8772d container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 19 12:39:18.530802 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:18.530431 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" podUID="94ddfebb-3e22-4687-ac70-95168adb6c6c" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 19 12:39:21.529581 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:39:21.529546 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8772d" Apr 19 12:40:11.292603 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.292566 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"] Apr 19 12:40:11.298779 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.298762 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:11.301640 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.301615 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 12:40:11.301743 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.301652 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 12:40:11.302650 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.302634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-2j2k4\"" Apr 19 12:40:11.313386 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.313364 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"] Apr 19 12:40:11.385787 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.385752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9tl\" (UniqueName: \"kubernetes.io/projected/5d2d57ca-f982-48a6-8769-e0ad9c665d83-kube-api-access-sp9tl\") pod \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" (UID: \"5d2d57ca-f982-48a6-8769-e0ad9c665d83\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:11.486100 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.486071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9tl\" (UniqueName: \"kubernetes.io/projected/5d2d57ca-f982-48a6-8769-e0ad9c665d83-kube-api-access-sp9tl\") pod \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" (UID: \"5d2d57ca-f982-48a6-8769-e0ad9c665d83\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:11.499785 ip-10-0-129-233 
kubenswrapper[2578]: I0419 12:40:11.499758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9tl\" (UniqueName: \"kubernetes.io/projected/5d2d57ca-f982-48a6-8769-e0ad9c665d83-kube-api-access-sp9tl\") pod \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" (UID: \"5d2d57ca-f982-48a6-8769-e0ad9c665d83\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:11.608889 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.608824 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:11.728254 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:11.728232 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"] Apr 19 12:40:11.732451 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:40:11.732180 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d2d57ca_f982_48a6_8769_e0ad9c665d83.slice/crio-2f960736eba716a6fa09605e2094c76497e57ea8dc72e86ece531722ff81695c WatchSource:0}: Error finding container 2f960736eba716a6fa09605e2094c76497e57ea8dc72e86ece531722ff81695c: Status 404 returned error can't find the container with id 2f960736eba716a6fa09605e2094c76497e57ea8dc72e86ece531722ff81695c Apr 19 12:40:12.699661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:12.699620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" event={"ID":"5d2d57ca-f982-48a6-8769-e0ad9c665d83","Type":"ContainerStarted","Data":"2f960736eba716a6fa09605e2094c76497e57ea8dc72e86ece531722ff81695c"} Apr 19 12:40:14.707633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:14.707541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" event={"ID":"5d2d57ca-f982-48a6-8769-e0ad9c665d83","Type":"ContainerStarted","Data":"4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b"} Apr 19 12:40:14.707633 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:14.707606 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:14.723718 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:14.723667 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" podStartSLOduration=1.68113337 podStartE2EDuration="3.723652922s" podCreationTimestamp="2026-04-19 12:40:11 +0000 UTC" firstStartedPulling="2026-04-19 12:40:11.734590011 +0000 UTC m=+575.252747603" lastFinishedPulling="2026-04-19 12:40:13.777109563 +0000 UTC m=+577.295267155" observedRunningTime="2026-04-19 12:40:14.721836433 +0000 UTC m=+578.239994050" watchObservedRunningTime="2026-04-19 12:40:14.723652922 +0000 UTC m=+578.241810536" Apr 19 12:40:25.712918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:25.712890 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" Apr 19 12:40:33.297917 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.297880 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"] Apr 19 12:40:33.301523 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.301502 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.304063 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.304041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wnc7s\"" Apr 19 12:40:33.315941 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.315914 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"] Apr 19 12:40:33.347577 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.347555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqm8\" (UniqueName: \"kubernetes.io/projected/0761232e-bcc4-421b-b016-8ce8e654a465-kube-api-access-cdqm8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.347666 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.347599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0761232e-bcc4-421b-b016-8ce8e654a465-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.448162 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.448137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdqm8\" (UniqueName: \"kubernetes.io/projected/0761232e-bcc4-421b-b016-8ce8e654a465-kube-api-access-cdqm8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.448267 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.448175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0761232e-bcc4-421b-b016-8ce8e654a465-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.448512 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.448471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0761232e-bcc4-421b-b016-8ce8e654a465-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.456865 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.456847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdqm8\" (UniqueName: \"kubernetes.io/projected/0761232e-bcc4-421b-b016-8ce8e654a465-kube-api-access-cdqm8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.610919 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.610856 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" Apr 19 12:40:33.733951 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.733921 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"] Apr 19 12:40:33.737110 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:40:33.737081 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0761232e_bcc4_421b_b016_8ce8e654a465.slice/crio-5b3682927d553b64868195426ea6579eb1ba9ee24e66efdbebaa0bc55481b456 WatchSource:0}: Error finding container 5b3682927d553b64868195426ea6579eb1ba9ee24e66efdbebaa0bc55481b456: Status 404 returned error can't find the container with id 5b3682927d553b64868195426ea6579eb1ba9ee24e66efdbebaa0bc55481b456 Apr 19 12:40:33.772654 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.772621 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" event={"ID":"0761232e-bcc4-421b-b016-8ce8e654a465","Type":"ContainerStarted","Data":"5b3682927d553b64868195426ea6579eb1ba9ee24e66efdbebaa0bc55481b456"} Apr 19 12:40:33.991628 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.991600 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"] Apr 19 12:40:33.999173 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:33.999145 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"] Apr 19 12:40:34.017798 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.017768 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"] Apr 19 12:40:34.022247 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.022226 2578 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"] Apr 19 12:40:34.022456 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.022379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" Apr 19 12:40:34.023305 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.022885 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" containerName="manager" containerID="cri-o://4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b" gracePeriod=2 Apr 19 12:40:34.025326 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.024705 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"] Apr 19 12:40:34.031504 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.031460 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"] Apr 19 12:40:34.047808 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.047782 2578 status_manager.go:895] "Failed to get status for pod" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object" Apr 19 12:40:34.048160 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.048146 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"] Apr 19 12:40:34.048441 ip-10-0-129-233 kubenswrapper[2578]: 
I0419 12:40:34.048426 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" containerName="manager"
Apr 19 12:40:34.048535 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.048444 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" containerName="manager"
Apr 19 12:40:34.048588 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.048547 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" containerName="manager"
Apr 19 12:40:34.051258 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.051241 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:34.052873 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.052852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.052968 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.052906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnxm6\" (UniqueName: \"kubernetes.io/projected/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-kube-api-access-rnxm6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.070558 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.066652 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"]
Apr 19 12:40:34.079538 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.079506 2578 status_manager.go:895] "Failed to get status for pod" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:34.153828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.153796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289ft\" (UniqueName: \"kubernetes.io/projected/70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4-kube-api-access-289ft\") pod \"limitador-operator-controller-manager-85c4996f8c-kj6nf\" (UID: \"70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:34.153941 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.153850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.153941 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.153929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnxm6\" (UniqueName: \"kubernetes.io/projected/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-kube-api-access-rnxm6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.154258 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.154236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.162765 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.162740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnxm6\" (UniqueName: \"kubernetes.io/projected/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-kube-api-access-rnxm6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.255380 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.255267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-289ft\" (UniqueName: \"kubernetes.io/projected/70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4-kube-api-access-289ft\") pod \"limitador-operator-controller-manager-85c4996f8c-kj6nf\" (UID: \"70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:34.263951 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.263924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-289ft\" (UniqueName: \"kubernetes.io/projected/70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4-kube-api-access-289ft\") pod \"limitador-operator-controller-manager-85c4996f8c-kj6nf\" (UID: \"70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:34.295399 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.295374 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"
Apr 19 12:40:34.297461 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.297426 2578 status_manager.go:895] "Failed to get status for pod" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:34.336702 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.336675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:34.356626 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.356601 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9tl\" (UniqueName: \"kubernetes.io/projected/5d2d57ca-f982-48a6-8769-e0ad9c665d83-kube-api-access-sp9tl\") pod \"5d2d57ca-f982-48a6-8769-e0ad9c665d83\" (UID: \"5d2d57ca-f982-48a6-8769-e0ad9c665d83\") "
Apr 19 12:40:34.359135 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.359102 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2d57ca-f982-48a6-8769-e0ad9c665d83-kube-api-access-sp9tl" (OuterVolumeSpecName: "kube-api-access-sp9tl") pod "5d2d57ca-f982-48a6-8769-e0ad9c665d83" (UID: "5d2d57ca-f982-48a6-8769-e0ad9c665d83"). InnerVolumeSpecName "kube-api-access-sp9tl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:40:34.398568 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.398525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:34.458093 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.458055 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sp9tl\" (UniqueName: \"kubernetes.io/projected/5d2d57ca-f982-48a6-8769-e0ad9c665d83-kube-api-access-sp9tl\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\""
Apr 19 12:40:34.647174 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.647118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"]
Apr 19 12:40:34.679005 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:40:34.678948 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc5f64d_e065_417d_83cc_a3aa7683f6ca.slice/crio-d774cd03aee25899b274e434765bd6222a75d39da76b8e46b6018ec5193a00c7 WatchSource:0}: Error finding container d774cd03aee25899b274e434765bd6222a75d39da76b8e46b6018ec5193a00c7: Status 404 returned error can't find the container with id d774cd03aee25899b274e434765bd6222a75d39da76b8e46b6018ec5193a00c7
Apr 19 12:40:34.756327 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.756297 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"]
Apr 19 12:40:34.778208 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.778135 2578 generic.go:358] "Generic (PLEG): container finished" podID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" containerID="4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b" exitCode=0
Apr 19 12:40:34.778208 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.778189 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd"
Apr 19 12:40:34.778400 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.778229 2578 scope.go:117] "RemoveContainer" containerID="4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b"
Apr 19 12:40:34.780533 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.780271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" event={"ID":"4bc5f64d-e065-417d-83cc-a3aa7683f6ca","Type":"ContainerStarted","Data":"d774cd03aee25899b274e434765bd6222a75d39da76b8e46b6018ec5193a00c7"}
Apr 19 12:40:34.780533 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.780473 2578 status_manager.go:895] "Failed to get status for pod" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:34.786943 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:40:34.786903 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70dd7ed8_aaf8_42f5_9d0e_9f87d1c245a4.slice/crio-45258a935b6bf2e83571a487f4b57980a71664bfabe7ae57c3cacdb2c7b313ba WatchSource:0}: Error finding container 45258a935b6bf2e83571a487f4b57980a71664bfabe7ae57c3cacdb2c7b313ba: Status 404 returned error can't find the container with id 45258a935b6bf2e83571a487f4b57980a71664bfabe7ae57c3cacdb2c7b313ba
Apr 19 12:40:34.790928 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.790909 2578 scope.go:117] "RemoveContainer" containerID="4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b"
Apr 19 12:40:34.791246 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:40:34.791218 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b\": container with ID starting with 4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b not found: ID does not exist" containerID="4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b"
Apr 19 12:40:34.791325 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.791257 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b"} err="failed to get container status \"4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b\": rpc error: code = NotFound desc = could not find container \"4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b\": container with ID starting with 4d7458f98f2cc462fe7cfa6094b5ccd8c83bf66181b8635a79b014488854005b not found: ID does not exist"
Apr 19 12:40:34.791419 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:34.791385 2578 status_manager.go:895] "Failed to get status for pod" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rsxwd" err="pods \"limitador-operator-controller-manager-85c4996f8c-rsxwd\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:35.064703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:35.064622 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d2d57ca-f982-48a6-8769-e0ad9c665d83" path="/var/lib/kubelet/pods/5d2d57ca-f982-48a6-8769-e0ad9c665d83/volumes"
Apr 19 12:40:35.787099 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:35.787038 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf" event={"ID":"70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4","Type":"ContainerStarted","Data":"a1e208212c5adb1f17b9079b4a7b855f99cd40f1f018a28eee8f62ca288ab9eb"}
Apr 19 12:40:35.787099 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:35.787079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf" event={"ID":"70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4","Type":"ContainerStarted","Data":"45258a935b6bf2e83571a487f4b57980a71664bfabe7ae57c3cacdb2c7b313ba"}
Apr 19 12:40:35.787692 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:35.787199 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:35.805375 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:35.805326 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf" podStartSLOduration=1.80530745 podStartE2EDuration="1.80530745s" podCreationTimestamp="2026-04-19 12:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:40:35.804834956 +0000 UTC m=+599.322992570" watchObservedRunningTime="2026-04-19 12:40:35.80530745 +0000 UTC m=+599.323465065"
Apr 19 12:40:37.380442 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.380417 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:40:37.380910 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.380505 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log"
Apr 19 12:40:37.794873 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.794836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" event={"ID":"4bc5f64d-e065-417d-83cc-a3aa7683f6ca","Type":"ContainerStarted","Data":"37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81"}
Apr 19 12:40:37.795061 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.794939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:40:37.796297 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.796271 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" containerName="manager" containerID="cri-o://f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c" gracePeriod=2
Apr 19 12:40:37.818661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.818627 2578 status_manager.go:895] "Failed to get status for pod" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:37.819066 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:37.819022 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" podStartSLOduration=2.065570293 podStartE2EDuration="4.819011116s" podCreationTimestamp="2026-04-19 12:40:33 +0000 UTC" firstStartedPulling="2026-04-19 12:40:34.683285889 +0000 UTC m=+598.201443495" lastFinishedPulling="2026-04-19 12:40:37.436726723 +0000 UTC m=+600.954884318" observedRunningTime="2026-04-19 12:40:37.816331624 +0000 UTC m=+601.334489236" watchObservedRunningTime="2026-04-19 12:40:37.819011116 +0000 UTC m=+601.337168729"
Apr 19 12:40:38.022949 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.022929 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"
Apr 19 12:40:38.025112 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.025088 2578 status_manager.go:895] "Failed to get status for pod" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:38.093311 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.093246 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdqm8\" (UniqueName: \"kubernetes.io/projected/0761232e-bcc4-421b-b016-8ce8e654a465-kube-api-access-cdqm8\") pod \"0761232e-bcc4-421b-b016-8ce8e654a465\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") "
Apr 19 12:40:38.093433 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.093323 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0761232e-bcc4-421b-b016-8ce8e654a465-extensions-socket-volume\") pod \"0761232e-bcc4-421b-b016-8ce8e654a465\" (UID: \"0761232e-bcc4-421b-b016-8ce8e654a465\") "
Apr 19 12:40:38.093639 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.093613 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0761232e-bcc4-421b-b016-8ce8e654a465-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0761232e-bcc4-421b-b016-8ce8e654a465" (UID: "0761232e-bcc4-421b-b016-8ce8e654a465"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:40:38.095416 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.095400 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0761232e-bcc4-421b-b016-8ce8e654a465-kube-api-access-cdqm8" (OuterVolumeSpecName: "kube-api-access-cdqm8") pod "0761232e-bcc4-421b-b016-8ce8e654a465" (UID: "0761232e-bcc4-421b-b016-8ce8e654a465"). InnerVolumeSpecName "kube-api-access-cdqm8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:40:38.194199 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.194174 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdqm8\" (UniqueName: \"kubernetes.io/projected/0761232e-bcc4-421b-b016-8ce8e654a465-kube-api-access-cdqm8\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\""
Apr 19 12:40:38.194199 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.194196 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0761232e-bcc4-421b-b016-8ce8e654a465-extensions-socket-volume\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\""
Apr 19 12:40:38.800928 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.800838 2578 generic.go:358] "Generic (PLEG): container finished" podID="0761232e-bcc4-421b-b016-8ce8e654a465" containerID="f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c" exitCode=2
Apr 19 12:40:38.800928 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.800889 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6"
Apr 19 12:40:38.800928 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.800899 2578 scope.go:117] "RemoveContainer" containerID="f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c"
Apr 19 12:40:38.802940 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.802915 2578 status_manager.go:895] "Failed to get status for pod" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:38.809163 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.809146 2578 scope.go:117] "RemoveContainer" containerID="f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c"
Apr 19 12:40:38.809401 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:40:38.809382 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c\": container with ID starting with f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c not found: ID does not exist" containerID="f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c"
Apr 19 12:40:38.809456 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.809410 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c"} err="failed to get container status \"f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c\": rpc error: code = NotFound desc = could not find container \"f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c\": container with ID starting with f0b63d9c4bb4f1907a3913d2bd853a436ce96afd71f5b933146d687b92cb669c not found: ID does not exist"
Apr 19 12:40:38.810595 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:38.810573 2578 status_manager.go:895] "Failed to get status for pod" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-cvfc6\" is forbidden: User \"system:node:ip-10-0-129-233.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-233.ec2.internal' and this object"
Apr 19 12:40:39.063304 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:39.063243 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" path="/var/lib/kubelet/pods/0761232e-bcc4-421b-b016-8ce8e654a465/volumes"
Apr 19 12:40:46.793681 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:46.793652 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kj6nf"
Apr 19 12:40:48.802679 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:40:48.802647 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:41:01.458104 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.458014 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"]
Apr 19 12:41:01.458801 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.458743 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" podUID="4bc5f64d-e065-417d-83cc-a3aa7683f6ca" containerName="manager" containerID="cri-o://37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81" gracePeriod=10
Apr 19 12:41:01.700438 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.700417 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:41:01.758038 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.757963 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnxm6\" (UniqueName: \"kubernetes.io/projected/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-kube-api-access-rnxm6\") pod \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") "
Apr 19 12:41:01.758038 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.758006 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-extensions-socket-volume\") pod \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\" (UID: \"4bc5f64d-e065-417d-83cc-a3aa7683f6ca\") "
Apr 19 12:41:01.758518 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.758472 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "4bc5f64d-e065-417d-83cc-a3aa7683f6ca" (UID: "4bc5f64d-e065-417d-83cc-a3aa7683f6ca"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:41:01.760020 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.759996 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-kube-api-access-rnxm6" (OuterVolumeSpecName: "kube-api-access-rnxm6") pod "4bc5f64d-e065-417d-83cc-a3aa7683f6ca" (UID: "4bc5f64d-e065-417d-83cc-a3aa7683f6ca"). InnerVolumeSpecName "kube-api-access-rnxm6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:41:01.858622 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.858595 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnxm6\" (UniqueName: \"kubernetes.io/projected/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-kube-api-access-rnxm6\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\""
Apr 19 12:41:01.858622 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.858625 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bc5f64d-e065-417d-83cc-a3aa7683f6ca-extensions-socket-volume\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\""
Apr 19 12:41:01.884584 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.884557 2578 generic.go:358] "Generic (PLEG): container finished" podID="4bc5f64d-e065-417d-83cc-a3aa7683f6ca" containerID="37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81" exitCode=0
Apr 19 12:41:01.884677 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.884622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" event={"ID":"4bc5f64d-e065-417d-83cc-a3aa7683f6ca","Type":"ContainerDied","Data":"37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81"}
Apr 19 12:41:01.884677 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.884625 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"
Apr 19 12:41:01.884677 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.884661 2578 scope.go:117] "RemoveContainer" containerID="37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81"
Apr 19 12:41:01.884813 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.884650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw" event={"ID":"4bc5f64d-e065-417d-83cc-a3aa7683f6ca","Type":"ContainerDied","Data":"d774cd03aee25899b274e434765bd6222a75d39da76b8e46b6018ec5193a00c7"}
Apr 19 12:41:01.892431 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.892410 2578 scope.go:117] "RemoveContainer" containerID="37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81"
Apr 19 12:41:01.892715 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:41:01.892690 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81\": container with ID starting with 37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81 not found: ID does not exist" containerID="37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81"
Apr 19 12:41:01.892778 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.892722 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81"} err="failed to get container status \"37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81\": rpc error: code = NotFound desc = could not find container \"37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81\": container with ID starting with 37f115297bb2c3fcc612edf0d6a90ccf2d027c5766f04bf02f0090950c78fa81 not found: ID does not exist"
Apr 19 12:41:01.904581 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.904552 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"]
Apr 19 12:41:01.910551 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:01.910529 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kzzbw"]
Apr 19 12:41:03.063639 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:03.063605 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc5f64d-e065-417d-83cc-a3aa7683f6ca" path="/var/lib/kubelet/pods/4bc5f64d-e065-417d-83cc-a3aa7683f6ca/volumes"
Apr 19 12:41:18.333260 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333222 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"]
Apr 19 12:41:18.335703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333552 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" containerName="manager"
Apr 19 12:41:18.335703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333565 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" containerName="manager"
Apr 19 12:41:18.335703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333580 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bc5f64d-e065-417d-83cc-a3aa7683f6ca" containerName="manager"
Apr 19 12:41:18.335703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333585 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc5f64d-e065-417d-83cc-a3aa7683f6ca" containerName="manager"
Apr 19 12:41:18.335703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333628 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0761232e-bcc4-421b-b016-8ce8e654a465" containerName="manager"
Apr 19 12:41:18.335703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.333635 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bc5f64d-e065-417d-83cc-a3aa7683f6ca" containerName="manager"
Apr 19 12:41:18.336576 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.336560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.338892 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.338867 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4cwbr\""
Apr 19 12:41:18.338892 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.338881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 19 12:41:18.343472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.343442 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"]
Apr 19 12:41:18.428089 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.428059 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"]
Apr 19 12:41:18.487827 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.487795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b5fcd156-921f-480c-8cbe-4edf68276cbe-config-file\") pod \"limitador-limitador-7d549b5b-25kfk\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.487827 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.487833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6vk\" (UniqueName: \"kubernetes.io/projected/b5fcd156-921f-480c-8cbe-4edf68276cbe-kube-api-access-9v6vk\") pod \"limitador-limitador-7d549b5b-25kfk\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.588402 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.588338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b5fcd156-921f-480c-8cbe-4edf68276cbe-config-file\") pod \"limitador-limitador-7d549b5b-25kfk\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.588402 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.588369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6vk\" (UniqueName: \"kubernetes.io/projected/b5fcd156-921f-480c-8cbe-4edf68276cbe-kube-api-access-9v6vk\") pod \"limitador-limitador-7d549b5b-25kfk\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.588918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.588900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b5fcd156-921f-480c-8cbe-4edf68276cbe-config-file\") pod \"limitador-limitador-7d549b5b-25kfk\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.596772 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.596746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6vk\" (UniqueName: \"kubernetes.io/projected/b5fcd156-921f-480c-8cbe-4edf68276cbe-kube-api-access-9v6vk\") pod \"limitador-limitador-7d549b5b-25kfk\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.647314 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.647287 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:18.762565 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.762536 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"]
Apr 19 12:41:18.765525 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:41:18.765464 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fcd156_921f_480c_8cbe_4edf68276cbe.slice/crio-1b6ae7845ca667f33bf2aac08a7730828038557b118a95350b49d4074987e0bf WatchSource:0}: Error finding container 1b6ae7845ca667f33bf2aac08a7730828038557b118a95350b49d4074987e0bf: Status 404 returned error can't find the container with id 1b6ae7845ca667f33bf2aac08a7730828038557b118a95350b49d4074987e0bf
Apr 19 12:41:18.939296 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:18.939260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" event={"ID":"b5fcd156-921f-480c-8cbe-4edf68276cbe","Type":"ContainerStarted","Data":"1b6ae7845ca667f33bf2aac08a7730828038557b118a95350b49d4074987e0bf"}
Apr 19 12:41:21.951536 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:21.951497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" event={"ID":"b5fcd156-921f-480c-8cbe-4edf68276cbe","Type":"ContainerStarted","Data":"f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a"}
Apr 19 12:41:21.951913 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:21.951612 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:21.968206 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:21.968157 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" podStartSLOduration=1.289867739 podStartE2EDuration="3.968143141s" podCreationTimestamp="2026-04-19 12:41:18 +0000 UTC" firstStartedPulling="2026-04-19 12:41:18.767195348 +0000 UTC m=+642.285352945" lastFinishedPulling="2026-04-19 12:41:21.445470756 +0000 UTC m=+644.963628347" observedRunningTime="2026-04-19 12:41:21.966887932 +0000 UTC m=+645.485045537" watchObservedRunningTime="2026-04-19 12:41:21.968143141 +0000 UTC m=+645.486300751"
Apr 19 12:41:32.956275 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:32.956245 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk"
Apr 19 12:41:33.885016 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:33.884983 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"]
Apr 19 12:41:33.885202 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:33.885177 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" podUID="b5fcd156-921f-480c-8cbe-4edf68276cbe" containerName="limitador" containerID="cri-o://f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a" gracePeriod=30
Apr 19 12:41:34.815812 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.815791 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" Apr 19 12:41:34.914353 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.914326 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b5fcd156-921f-480c-8cbe-4edf68276cbe-config-file\") pod \"b5fcd156-921f-480c-8cbe-4edf68276cbe\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " Apr 19 12:41:34.914513 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.914383 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v6vk\" (UniqueName: \"kubernetes.io/projected/b5fcd156-921f-480c-8cbe-4edf68276cbe-kube-api-access-9v6vk\") pod \"b5fcd156-921f-480c-8cbe-4edf68276cbe\" (UID: \"b5fcd156-921f-480c-8cbe-4edf68276cbe\") " Apr 19 12:41:34.914706 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.914682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5fcd156-921f-480c-8cbe-4edf68276cbe-config-file" (OuterVolumeSpecName: "config-file") pod "b5fcd156-921f-480c-8cbe-4edf68276cbe" (UID: "b5fcd156-921f-480c-8cbe-4edf68276cbe"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:41:34.916524 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.916501 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fcd156-921f-480c-8cbe-4edf68276cbe-kube-api-access-9v6vk" (OuterVolumeSpecName: "kube-api-access-9v6vk") pod "b5fcd156-921f-480c-8cbe-4edf68276cbe" (UID: "b5fcd156-921f-480c-8cbe-4edf68276cbe"). InnerVolumeSpecName "kube-api-access-9v6vk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:41:34.973465 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.973391 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-8zr62"] Apr 19 12:41:34.973710 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.973696 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5fcd156-921f-480c-8cbe-4edf68276cbe" containerName="limitador" Apr 19 12:41:34.973710 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.973712 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fcd156-921f-480c-8cbe-4edf68276cbe" containerName="limitador" Apr 19 12:41:34.973790 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.973785 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5fcd156-921f-480c-8cbe-4edf68276cbe" containerName="limitador" Apr 19 12:41:34.976917 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.976894 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:34.979667 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.979433 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 19 12:41:34.979667 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.979637 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-vzgvq\"" Apr 19 12:41:34.987762 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.987713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-8zr62"] Apr 19 12:41:34.994136 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.994110 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5fcd156-921f-480c-8cbe-4edf68276cbe" containerID="f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a" exitCode=0 Apr 19 12:41:34.994320 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.994259 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" Apr 19 12:41:34.994421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.994209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" event={"ID":"b5fcd156-921f-480c-8cbe-4edf68276cbe","Type":"ContainerDied","Data":"f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a"} Apr 19 12:41:34.994421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.994374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-25kfk" event={"ID":"b5fcd156-921f-480c-8cbe-4edf68276cbe","Type":"ContainerDied","Data":"1b6ae7845ca667f33bf2aac08a7730828038557b118a95350b49d4074987e0bf"} Apr 19 12:41:34.994421 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:34.994401 2578 scope.go:117] "RemoveContainer" containerID="f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a" Apr 19 12:41:35.003398 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.003379 2578 scope.go:117] "RemoveContainer" containerID="f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a" Apr 19 12:41:35.003691 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:41:35.003670 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a\": container with ID starting with f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a not found: ID does not exist" containerID="f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a" Apr 19 12:41:35.003767 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.003698 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a"} err="failed to get container status 
\"f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a\": rpc error: code = NotFound desc = could not find container \"f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a\": container with ID starting with f874bc56a708b3efcb56145d7ce9b50f7d3da07c33cc7423a6339f742d33fb5a not found: ID does not exist" Apr 19 12:41:35.015094 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.015063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/43aeb4ec-2438-4029-98ad-416497c39e00-data\") pod \"postgres-868db5846d-8zr62\" (UID: \"43aeb4ec-2438-4029-98ad-416497c39e00\") " pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.015206 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.015186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnmpg\" (UniqueName: \"kubernetes.io/projected/43aeb4ec-2438-4029-98ad-416497c39e00-kube-api-access-wnmpg\") pod \"postgres-868db5846d-8zr62\" (UID: \"43aeb4ec-2438-4029-98ad-416497c39e00\") " pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.015262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.015248 2578 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b5fcd156-921f-480c-8cbe-4edf68276cbe-config-file\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:41:35.015301 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.015269 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9v6vk\" (UniqueName: \"kubernetes.io/projected/b5fcd156-921f-480c-8cbe-4edf68276cbe-kube-api-access-9v6vk\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:41:35.015472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.015458 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"] Apr 19 
12:41:35.021148 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.021128 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-25kfk"] Apr 19 12:41:35.063717 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.063696 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fcd156-921f-480c-8cbe-4edf68276cbe" path="/var/lib/kubelet/pods/b5fcd156-921f-480c-8cbe-4edf68276cbe/volumes" Apr 19 12:41:35.116594 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.116568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnmpg\" (UniqueName: \"kubernetes.io/projected/43aeb4ec-2438-4029-98ad-416497c39e00-kube-api-access-wnmpg\") pod \"postgres-868db5846d-8zr62\" (UID: \"43aeb4ec-2438-4029-98ad-416497c39e00\") " pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.116700 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.116621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/43aeb4ec-2438-4029-98ad-416497c39e00-data\") pod \"postgres-868db5846d-8zr62\" (UID: \"43aeb4ec-2438-4029-98ad-416497c39e00\") " pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.116940 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.116924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/43aeb4ec-2438-4029-98ad-416497c39e00-data\") pod \"postgres-868db5846d-8zr62\" (UID: \"43aeb4ec-2438-4029-98ad-416497c39e00\") " pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.125661 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.125641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnmpg\" (UniqueName: \"kubernetes.io/projected/43aeb4ec-2438-4029-98ad-416497c39e00-kube-api-access-wnmpg\") pod \"postgres-868db5846d-8zr62\" (UID: \"43aeb4ec-2438-4029-98ad-416497c39e00\") 
" pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.295527 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.295428 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:35.410659 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.410628 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-8zr62"] Apr 19 12:41:35.413543 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:41:35.413508 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43aeb4ec_2438_4029_98ad_416497c39e00.slice/crio-3bbdce7abd346c73b6161542f66bde45a066ab2ba814ed2faae86c176c9c2b4b WatchSource:0}: Error finding container 3bbdce7abd346c73b6161542f66bde45a066ab2ba814ed2faae86c176c9c2b4b: Status 404 returned error can't find the container with id 3bbdce7abd346c73b6161542f66bde45a066ab2ba814ed2faae86c176c9c2b4b Apr 19 12:41:35.999098 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:35.999062 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-8zr62" event={"ID":"43aeb4ec-2438-4029-98ad-416497c39e00","Type":"ContainerStarted","Data":"3bbdce7abd346c73b6161542f66bde45a066ab2ba814ed2faae86c176c9c2b4b"} Apr 19 12:41:41.735918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:41.735895 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 19 12:41:42.022170 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:42.022098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-8zr62" event={"ID":"43aeb4ec-2438-4029-98ad-416497c39e00","Type":"ContainerStarted","Data":"d547a12030b6099288fb2424e6c18caa0917ab3dab43fc40408f1f2bf747266f"} Apr 19 12:41:42.022296 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:42.022215 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:42.037828 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:42.037778 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-8zr62" podStartSLOduration=1.719282372 podStartE2EDuration="8.037763503s" podCreationTimestamp="2026-04-19 12:41:34 +0000 UTC" firstStartedPulling="2026-04-19 12:41:35.414902367 +0000 UTC m=+658.933059961" lastFinishedPulling="2026-04-19 12:41:41.733383481 +0000 UTC m=+665.251541092" observedRunningTime="2026-04-19 12:41:42.035641691 +0000 UTC m=+665.553799304" watchObservedRunningTime="2026-04-19 12:41:42.037763503 +0000 UTC m=+665.555921115" Apr 19 12:41:48.056180 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:48.056126 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-8zr62" Apr 19 12:41:50.777032 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.776996 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-twmlt"] Apr 19 12:41:50.783304 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.783275 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:50.785849 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.785826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6fc62\"" Apr 19 12:41:50.791517 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.791471 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-twmlt"] Apr 19 12:41:50.842905 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.842882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmd6\" (UniqueName: \"kubernetes.io/projected/ab717e66-3cb9-4456-a246-5b46ce0a3d25-kube-api-access-gtmd6\") pod \"maas-controller-6d4c8f55f9-twmlt\" (UID: \"ab717e66-3cb9-4456-a246-5b46ce0a3d25\") " pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:50.919769 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.919744 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-74d7868b9-gfcd7"] Apr 19 12:41:50.923413 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.923395 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:41:50.930702 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.930679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-74d7868b9-gfcd7"] Apr 19 12:41:50.943518 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.943497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmd6\" (UniqueName: \"kubernetes.io/projected/ab717e66-3cb9-4456-a246-5b46ce0a3d25-kube-api-access-gtmd6\") pod \"maas-controller-6d4c8f55f9-twmlt\" (UID: \"ab717e66-3cb9-4456-a246-5b46ce0a3d25\") " pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:50.950875 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:50.950857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmd6\" (UniqueName: \"kubernetes.io/projected/ab717e66-3cb9-4456-a246-5b46ce0a3d25-kube-api-access-gtmd6\") pod \"maas-controller-6d4c8f55f9-twmlt\" (UID: \"ab717e66-3cb9-4456-a246-5b46ce0a3d25\") " pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:51.030088 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.030028 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-twmlt"] Apr 19 12:41:51.030240 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.030217 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:51.044122 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.044100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll96x\" (UniqueName: \"kubernetes.io/projected/1b21855f-f05c-4f87-b9ae-7c8dec4de574-kube-api-access-ll96x\") pod \"maas-controller-74d7868b9-gfcd7\" (UID: \"1b21855f-f05c-4f87-b9ae-7c8dec4de574\") " pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:41:51.055078 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.055055 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-d6fc9457d-vwlj9"] Apr 19 12:41:51.060250 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.060231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:41:51.066396 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.066372 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-d6fc9457d-vwlj9"] Apr 19 12:41:51.145023 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.144994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48b59\" (UniqueName: \"kubernetes.io/projected/c4ca51f1-a1f9-429d-8680-9b297b39f3f4-kube-api-access-48b59\") pod \"maas-controller-d6fc9457d-vwlj9\" (UID: \"c4ca51f1-a1f9-429d-8680-9b297b39f3f4\") " pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:41:51.145180 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.145091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll96x\" (UniqueName: \"kubernetes.io/projected/1b21855f-f05c-4f87-b9ae-7c8dec4de574-kube-api-access-ll96x\") pod \"maas-controller-74d7868b9-gfcd7\" (UID: \"1b21855f-f05c-4f87-b9ae-7c8dec4de574\") " pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 
12:41:51.152574 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.152551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll96x\" (UniqueName: \"kubernetes.io/projected/1b21855f-f05c-4f87-b9ae-7c8dec4de574-kube-api-access-ll96x\") pod \"maas-controller-74d7868b9-gfcd7\" (UID: \"1b21855f-f05c-4f87-b9ae-7c8dec4de574\") " pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:41:51.155129 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.155107 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-twmlt"] Apr 19 12:41:51.157405 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:41:51.157383 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab717e66_3cb9_4456_a246_5b46ce0a3d25.slice/crio-04a422a8abd19b6effacc3debcb03e11d549dc2caa581d5fccd0a5bf1477c465 WatchSource:0}: Error finding container 04a422a8abd19b6effacc3debcb03e11d549dc2caa581d5fccd0a5bf1477c465: Status 404 returned error can't find the container with id 04a422a8abd19b6effacc3debcb03e11d549dc2caa581d5fccd0a5bf1477c465 Apr 19 12:41:51.234218 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.234194 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:41:51.246105 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.246078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48b59\" (UniqueName: \"kubernetes.io/projected/c4ca51f1-a1f9-429d-8680-9b297b39f3f4-kube-api-access-48b59\") pod \"maas-controller-d6fc9457d-vwlj9\" (UID: \"c4ca51f1-a1f9-429d-8680-9b297b39f3f4\") " pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:41:51.254672 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.254652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48b59\" (UniqueName: \"kubernetes.io/projected/c4ca51f1-a1f9-429d-8680-9b297b39f3f4-kube-api-access-48b59\") pod \"maas-controller-d6fc9457d-vwlj9\" (UID: \"c4ca51f1-a1f9-429d-8680-9b297b39f3f4\") " pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:41:51.351037 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.351011 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-74d7868b9-gfcd7"] Apr 19 12:41:51.353224 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:41:51.353196 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b21855f_f05c_4f87_b9ae_7c8dec4de574.slice/crio-36e4e2a208f21de99bb3eb05bb66020abc6a6fdf3f5078d2369e3c4874ea6e09 WatchSource:0}: Error finding container 36e4e2a208f21de99bb3eb05bb66020abc6a6fdf3f5078d2369e3c4874ea6e09: Status 404 returned error can't find the container with id 36e4e2a208f21de99bb3eb05bb66020abc6a6fdf3f5078d2369e3c4874ea6e09 Apr 19 12:41:51.373826 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.373802 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:41:51.493130 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:51.493018 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-d6fc9457d-vwlj9"] Apr 19 12:41:51.495686 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:41:51.495656 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ca51f1_a1f9_429d_8680_9b297b39f3f4.slice/crio-a168cc7b18f4f44843b16c31c04a71ce0070fbb17b241e718ec7f32e99e587f8 WatchSource:0}: Error finding container a168cc7b18f4f44843b16c31c04a71ce0070fbb17b241e718ec7f32e99e587f8: Status 404 returned error can't find the container with id a168cc7b18f4f44843b16c31c04a71ce0070fbb17b241e718ec7f32e99e587f8 Apr 19 12:41:52.059908 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:52.059833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" event={"ID":"c4ca51f1-a1f9-429d-8680-9b297b39f3f4","Type":"ContainerStarted","Data":"a168cc7b18f4f44843b16c31c04a71ce0070fbb17b241e718ec7f32e99e587f8"} Apr 19 12:41:52.061959 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:52.061838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74d7868b9-gfcd7" event={"ID":"1b21855f-f05c-4f87-b9ae-7c8dec4de574","Type":"ContainerStarted","Data":"36e4e2a208f21de99bb3eb05bb66020abc6a6fdf3f5078d2369e3c4874ea6e09"} Apr 19 12:41:52.064466 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:52.064415 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" event={"ID":"ab717e66-3cb9-4456-a246-5b46ce0a3d25","Type":"ContainerStarted","Data":"04a422a8abd19b6effacc3debcb03e11d549dc2caa581d5fccd0a5bf1477c465"} Apr 19 12:41:55.075340 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.075249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" event={"ID":"ab717e66-3cb9-4456-a246-5b46ce0a3d25","Type":"ContainerStarted","Data":"2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868"} Apr 19 12:41:55.075340 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.075310 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:55.075868 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.075327 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" podUID="ab717e66-3cb9-4456-a246-5b46ce0a3d25" containerName="manager" containerID="cri-o://2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868" gracePeriod=10 Apr 19 12:41:55.076705 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.076680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" event={"ID":"c4ca51f1-a1f9-429d-8680-9b297b39f3f4","Type":"ContainerStarted","Data":"3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855"} Apr 19 12:41:55.076857 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.076728 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:41:55.077934 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.077914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74d7868b9-gfcd7" event={"ID":"1b21855f-f05c-4f87-b9ae-7c8dec4de574","Type":"ContainerStarted","Data":"367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46"} Apr 19 12:41:55.078056 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.077995 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:41:55.092335 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.091954 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" podStartSLOduration=1.382868816 podStartE2EDuration="5.091931803s" podCreationTimestamp="2026-04-19 12:41:50 +0000 UTC" firstStartedPulling="2026-04-19 12:41:51.158823159 +0000 UTC m=+674.676980770" lastFinishedPulling="2026-04-19 12:41:54.867886166 +0000 UTC m=+678.386043757" observedRunningTime="2026-04-19 12:41:55.089965544 +0000 UTC m=+678.608123173" watchObservedRunningTime="2026-04-19 12:41:55.091931803 +0000 UTC m=+678.610089417" Apr 19 12:41:55.109376 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.109325 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-74d7868b9-gfcd7" podStartSLOduration=1.595629398 podStartE2EDuration="5.109311498s" podCreationTimestamp="2026-04-19 12:41:50 +0000 UTC" firstStartedPulling="2026-04-19 12:41:51.354599236 +0000 UTC m=+674.872756829" lastFinishedPulling="2026-04-19 12:41:54.868281339 +0000 UTC m=+678.386438929" observedRunningTime="2026-04-19 12:41:55.108985224 +0000 UTC m=+678.627142863" watchObservedRunningTime="2026-04-19 12:41:55.109311498 +0000 UTC m=+678.627469127" Apr 19 12:41:55.123844 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.123803 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" podStartSLOduration=0.742870171 podStartE2EDuration="4.123789082s" podCreationTimestamp="2026-04-19 12:41:51 +0000 UTC" firstStartedPulling="2026-04-19 12:41:51.496968463 +0000 UTC m=+675.015126054" lastFinishedPulling="2026-04-19 12:41:54.877887363 +0000 UTC m=+678.396044965" observedRunningTime="2026-04-19 12:41:55.122995788 +0000 UTC m=+678.641153401" watchObservedRunningTime="2026-04-19 12:41:55.123789082 +0000 UTC m=+678.641946695" Apr 19 12:41:55.320822 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.320769 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:55.381800 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.381726 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmd6\" (UniqueName: \"kubernetes.io/projected/ab717e66-3cb9-4456-a246-5b46ce0a3d25-kube-api-access-gtmd6\") pod \"ab717e66-3cb9-4456-a246-5b46ce0a3d25\" (UID: \"ab717e66-3cb9-4456-a246-5b46ce0a3d25\") " Apr 19 12:41:55.384271 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.384236 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab717e66-3cb9-4456-a246-5b46ce0a3d25-kube-api-access-gtmd6" (OuterVolumeSpecName: "kube-api-access-gtmd6") pod "ab717e66-3cb9-4456-a246-5b46ce0a3d25" (UID: "ab717e66-3cb9-4456-a246-5b46ce0a3d25"). InnerVolumeSpecName "kube-api-access-gtmd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:41:55.482923 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:55.482885 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtmd6\" (UniqueName: \"kubernetes.io/projected/ab717e66-3cb9-4456-a246-5b46ce0a3d25-kube-api-access-gtmd6\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:41:56.082293 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.082254 2578 generic.go:358] "Generic (PLEG): container finished" podID="ab717e66-3cb9-4456-a246-5b46ce0a3d25" containerID="2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868" exitCode=0 Apr 19 12:41:56.082765 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.082307 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" Apr 19 12:41:56.082765 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.082333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" event={"ID":"ab717e66-3cb9-4456-a246-5b46ce0a3d25","Type":"ContainerDied","Data":"2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868"} Apr 19 12:41:56.082765 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.082368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-twmlt" event={"ID":"ab717e66-3cb9-4456-a246-5b46ce0a3d25","Type":"ContainerDied","Data":"04a422a8abd19b6effacc3debcb03e11d549dc2caa581d5fccd0a5bf1477c465"} Apr 19 12:41:56.082765 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.082401 2578 scope.go:117] "RemoveContainer" containerID="2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868" Apr 19 12:41:56.091273 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.091256 2578 scope.go:117] "RemoveContainer" containerID="2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868" Apr 19 12:41:56.091553 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:41:56.091532 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868\": container with ID starting with 2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868 not found: ID does not exist" containerID="2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868" Apr 19 12:41:56.091616 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.091562 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868"} err="failed to get container status \"2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868\": rpc error: 
code = NotFound desc = could not find container \"2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868\": container with ID starting with 2e3d02056cfbc5319eb771ed3cb48f5087ccfb2fa3425e364ec7f81231030868 not found: ID does not exist" Apr 19 12:41:56.102992 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.102966 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-twmlt"] Apr 19 12:41:56.106950 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.106928 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-twmlt"] Apr 19 12:41:56.895709 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.895661 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-859db5f6bd-q9fnw"] Apr 19 12:41:56.896046 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.896032 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab717e66-3cb9-4456-a246-5b46ce0a3d25" containerName="manager" Apr 19 12:41:56.896091 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.896047 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab717e66-3cb9-4456-a246-5b46ce0a3d25" containerName="manager" Apr 19 12:41:56.896142 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.896113 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab717e66-3cb9-4456-a246-5b46ce0a3d25" containerName="manager" Apr 19 12:41:56.900109 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.900094 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:56.902272 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.902250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 19 12:41:56.902389 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.902282 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bbg64\"" Apr 19 12:41:56.902389 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.902288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 19 12:41:56.906630 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.906608 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-859db5f6bd-q9fnw"] Apr 19 12:41:56.993930 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.993903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a57969b3-66f7-47eb-a4e0-ad7425e59979-maas-api-tls\") pod \"maas-api-859db5f6bd-q9fnw\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:56.994046 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:56.993971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbmd9\" (UniqueName: \"kubernetes.io/projected/a57969b3-66f7-47eb-a4e0-ad7425e59979-kube-api-access-vbmd9\") pod \"maas-api-859db5f6bd-q9fnw\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:57.063683 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.063652 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab717e66-3cb9-4456-a246-5b46ce0a3d25" path="/var/lib/kubelet/pods/ab717e66-3cb9-4456-a246-5b46ce0a3d25/volumes" Apr 19 
12:41:57.094524 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.094499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbmd9\" (UniqueName: \"kubernetes.io/projected/a57969b3-66f7-47eb-a4e0-ad7425e59979-kube-api-access-vbmd9\") pod \"maas-api-859db5f6bd-q9fnw\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:57.094857 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.094539 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a57969b3-66f7-47eb-a4e0-ad7425e59979-maas-api-tls\") pod \"maas-api-859db5f6bd-q9fnw\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:57.097016 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.096994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a57969b3-66f7-47eb-a4e0-ad7425e59979-maas-api-tls\") pod \"maas-api-859db5f6bd-q9fnw\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:57.102171 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.102138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbmd9\" (UniqueName: \"kubernetes.io/projected/a57969b3-66f7-47eb-a4e0-ad7425e59979-kube-api-access-vbmd9\") pod \"maas-api-859db5f6bd-q9fnw\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:57.211615 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.211580 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:57.352576 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:57.352404 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-859db5f6bd-q9fnw"] Apr 19 12:41:57.354649 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:41:57.354619 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57969b3_66f7_47eb_a4e0_ad7425e59979.slice/crio-b8b89e9152b0fd19a05f05d3e1b7de6ae8893601b12532493183442c725dd2c1 WatchSource:0}: Error finding container b8b89e9152b0fd19a05f05d3e1b7de6ae8893601b12532493183442c725dd2c1: Status 404 returned error can't find the container with id b8b89e9152b0fd19a05f05d3e1b7de6ae8893601b12532493183442c725dd2c1 Apr 19 12:41:58.089802 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:58.089768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-859db5f6bd-q9fnw" event={"ID":"a57969b3-66f7-47eb-a4e0-ad7425e59979","Type":"ContainerStarted","Data":"b8b89e9152b0fd19a05f05d3e1b7de6ae8893601b12532493183442c725dd2c1"} Apr 19 12:41:59.094621 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:59.094580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-859db5f6bd-q9fnw" event={"ID":"a57969b3-66f7-47eb-a4e0-ad7425e59979","Type":"ContainerStarted","Data":"8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a"} Apr 19 12:41:59.095036 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:59.094690 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:41:59.109636 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:41:59.109596 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-859db5f6bd-q9fnw" podStartSLOduration=1.591410853 podStartE2EDuration="3.109583123s" podCreationTimestamp="2026-04-19 12:41:56 +0000 UTC" 
firstStartedPulling="2026-04-19 12:41:57.355978757 +0000 UTC m=+680.874136349" lastFinishedPulling="2026-04-19 12:41:58.874151025 +0000 UTC m=+682.392308619" observedRunningTime="2026-04-19 12:41:59.108567679 +0000 UTC m=+682.626725302" watchObservedRunningTime="2026-04-19 12:41:59.109583123 +0000 UTC m=+682.627740751" Apr 19 12:42:05.103052 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:05.103027 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:42:06.088258 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.088220 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:42:06.088640 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.088620 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:42:06.137551 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.137498 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-74d7868b9-gfcd7"] Apr 19 12:42:06.137960 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.137747 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-74d7868b9-gfcd7" podUID="1b21855f-f05c-4f87-b9ae-7c8dec4de574" containerName="manager" containerID="cri-o://367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46" gracePeriod=10 Apr 19 12:42:06.371796 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.371773 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:42:06.470703 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.470659 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll96x\" (UniqueName: \"kubernetes.io/projected/1b21855f-f05c-4f87-b9ae-7c8dec4de574-kube-api-access-ll96x\") pod \"1b21855f-f05c-4f87-b9ae-7c8dec4de574\" (UID: \"1b21855f-f05c-4f87-b9ae-7c8dec4de574\") " Apr 19 12:42:06.473028 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.472995 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b21855f-f05c-4f87-b9ae-7c8dec4de574-kube-api-access-ll96x" (OuterVolumeSpecName: "kube-api-access-ll96x") pod "1b21855f-f05c-4f87-b9ae-7c8dec4de574" (UID: "1b21855f-f05c-4f87-b9ae-7c8dec4de574"). InnerVolumeSpecName "kube-api-access-ll96x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:42:06.571178 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:06.571152 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ll96x\" (UniqueName: \"kubernetes.io/projected/1b21855f-f05c-4f87-b9ae-7c8dec4de574-kube-api-access-ll96x\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:42:07.119842 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.119809 2578 generic.go:358] "Generic (PLEG): container finished" podID="1b21855f-f05c-4f87-b9ae-7c8dec4de574" containerID="367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46" exitCode=0 Apr 19 12:42:07.119979 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.119874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74d7868b9-gfcd7" event={"ID":"1b21855f-f05c-4f87-b9ae-7c8dec4de574","Type":"ContainerDied","Data":"367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46"} Apr 19 12:42:07.119979 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.119877 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-74d7868b9-gfcd7" Apr 19 12:42:07.119979 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.119897 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74d7868b9-gfcd7" event={"ID":"1b21855f-f05c-4f87-b9ae-7c8dec4de574","Type":"ContainerDied","Data":"36e4e2a208f21de99bb3eb05bb66020abc6a6fdf3f5078d2369e3c4874ea6e09"} Apr 19 12:42:07.119979 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.119921 2578 scope.go:117] "RemoveContainer" containerID="367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46" Apr 19 12:42:07.127695 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.127680 2578 scope.go:117] "RemoveContainer" containerID="367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46" Apr 19 12:42:07.127936 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:42:07.127918 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46\": container with ID starting with 367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46 not found: ID does not exist" containerID="367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46" Apr 19 12:42:07.127999 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.127948 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46"} err="failed to get container status \"367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46\": rpc error: code = NotFound desc = could not find container \"367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46\": container with ID starting with 367e18c356b247e65633760f7b11df781ed293c9d786eff36d1ee4120ddb3b46 not found: ID does not exist" Apr 19 12:42:07.133900 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:42:07.133878 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-74d7868b9-gfcd7"] Apr 19 12:42:07.137066 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:07.137046 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-74d7868b9-gfcd7"] Apr 19 12:42:09.063349 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:09.063316 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b21855f-f05c-4f87-b9ae-7c8dec4de574" path="/var/lib/kubelet/pods/1b21855f-f05c-4f87-b9ae-7c8dec4de574/volumes" Apr 19 12:42:20.746871 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:20.746836 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-d6fc9457d-vwlj9"] Apr 19 12:42:20.747262 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:20.747111 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" podUID="c4ca51f1-a1f9-429d-8680-9b297b39f3f4" containerName="manager" containerID="cri-o://3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855" gracePeriod=10 Apr 19 12:42:20.982819 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:20.982798 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:42:21.079138 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.079072 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48b59\" (UniqueName: \"kubernetes.io/projected/c4ca51f1-a1f9-429d-8680-9b297b39f3f4-kube-api-access-48b59\") pod \"c4ca51f1-a1f9-429d-8680-9b297b39f3f4\" (UID: \"c4ca51f1-a1f9-429d-8680-9b297b39f3f4\") " Apr 19 12:42:21.081140 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.081111 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ca51f1-a1f9-429d-8680-9b297b39f3f4-kube-api-access-48b59" (OuterVolumeSpecName: "kube-api-access-48b59") pod "c4ca51f1-a1f9-429d-8680-9b297b39f3f4" (UID: "c4ca51f1-a1f9-429d-8680-9b297b39f3f4"). InnerVolumeSpecName "kube-api-access-48b59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:42:21.169064 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.169030 2578 generic.go:358] "Generic (PLEG): container finished" podID="c4ca51f1-a1f9-429d-8680-9b297b39f3f4" containerID="3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855" exitCode=0 Apr 19 12:42:21.169202 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.169087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" event={"ID":"c4ca51f1-a1f9-429d-8680-9b297b39f3f4","Type":"ContainerDied","Data":"3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855"} Apr 19 12:42:21.169202 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.169100 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" Apr 19 12:42:21.169202 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.169121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d6fc9457d-vwlj9" event={"ID":"c4ca51f1-a1f9-429d-8680-9b297b39f3f4","Type":"ContainerDied","Data":"a168cc7b18f4f44843b16c31c04a71ce0070fbb17b241e718ec7f32e99e587f8"} Apr 19 12:42:21.169202 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.169139 2578 scope.go:117] "RemoveContainer" containerID="3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855" Apr 19 12:42:21.179275 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.179260 2578 scope.go:117] "RemoveContainer" containerID="3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855" Apr 19 12:42:21.179644 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:42:21.179619 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855\": container with ID starting with 3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855 not found: ID does not exist" containerID="3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855" Apr 19 12:42:21.179739 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.179648 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-48b59\" (UniqueName: \"kubernetes.io/projected/c4ca51f1-a1f9-429d-8680-9b297b39f3f4-kube-api-access-48b59\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:42:21.179739 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.179654 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855"} err="failed to get container status \"3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855\": rpc error: code = NotFound 
desc = could not find container \"3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855\": container with ID starting with 3ae241741c305fc0539ce15511af14be86e0b5d3b950391f2ae2bf202700f855 not found: ID does not exist" Apr 19 12:42:21.191219 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.191194 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-d6fc9457d-vwlj9"] Apr 19 12:42:21.196058 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:21.196038 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-d6fc9457d-vwlj9"] Apr 19 12:42:23.063548 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:23.063520 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ca51f1-a1f9-429d-8680-9b297b39f3f4" path="/var/lib/kubelet/pods/c4ca51f1-a1f9-429d-8680-9b297b39f3f4/volumes" Apr 19 12:42:27.114467 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.114386 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-699df8bd8f-7wtll"] Apr 19 12:42:27.114989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.114720 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4ca51f1-a1f9-429d-8680-9b297b39f3f4" containerName="manager" Apr 19 12:42:27.114989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.114733 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ca51f1-a1f9-429d-8680-9b297b39f3f4" containerName="manager" Apr 19 12:42:27.114989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.114744 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b21855f-f05c-4f87-b9ae-7c8dec4de574" containerName="manager" Apr 19 12:42:27.114989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.114749 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b21855f-f05c-4f87-b9ae-7c8dec4de574" containerName="manager" Apr 19 12:42:27.114989 ip-10-0-129-233 kubenswrapper[2578]: I0419 
12:42:27.114802 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4ca51f1-a1f9-429d-8680-9b297b39f3f4" containerName="manager" Apr 19 12:42:27.114989 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.114817 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b21855f-f05c-4f87-b9ae-7c8dec4de574" containerName="manager" Apr 19 12:42:27.118171 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.118156 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.125746 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.125721 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-699df8bd8f-7wtll"] Apr 19 12:42:27.225821 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.225791 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnmqx\" (UniqueName: \"kubernetes.io/projected/c0d0f8be-9607-444d-9aa2-a79ee5318c2a-kube-api-access-wnmqx\") pod \"maas-api-699df8bd8f-7wtll\" (UID: \"c0d0f8be-9607-444d-9aa2-a79ee5318c2a\") " pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.225969 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.225845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c0d0f8be-9607-444d-9aa2-a79ee5318c2a-maas-api-tls\") pod \"maas-api-699df8bd8f-7wtll\" (UID: \"c0d0f8be-9607-444d-9aa2-a79ee5318c2a\") " pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.326537 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.326502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnmqx\" (UniqueName: \"kubernetes.io/projected/c0d0f8be-9607-444d-9aa2-a79ee5318c2a-kube-api-access-wnmqx\") pod \"maas-api-699df8bd8f-7wtll\" (UID: \"c0d0f8be-9607-444d-9aa2-a79ee5318c2a\") " 
pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.326705 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.326560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c0d0f8be-9607-444d-9aa2-a79ee5318c2a-maas-api-tls\") pod \"maas-api-699df8bd8f-7wtll\" (UID: \"c0d0f8be-9607-444d-9aa2-a79ee5318c2a\") " pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.329199 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.329169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c0d0f8be-9607-444d-9aa2-a79ee5318c2a-maas-api-tls\") pod \"maas-api-699df8bd8f-7wtll\" (UID: \"c0d0f8be-9607-444d-9aa2-a79ee5318c2a\") " pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.333993 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.333968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnmqx\" (UniqueName: \"kubernetes.io/projected/c0d0f8be-9607-444d-9aa2-a79ee5318c2a-kube-api-access-wnmqx\") pod \"maas-api-699df8bd8f-7wtll\" (UID: \"c0d0f8be-9607-444d-9aa2-a79ee5318c2a\") " pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.428288 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.428257 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:27.550887 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:27.550858 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-699df8bd8f-7wtll"] Apr 19 12:42:27.554901 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:42:27.554870 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0d0f8be_9607_444d_9aa2_a79ee5318c2a.slice/crio-56ace16b923e3426799f454c694129708ce25f84607d1c2643c6856557058d50 WatchSource:0}: Error finding container 56ace16b923e3426799f454c694129708ce25f84607d1c2643c6856557058d50: Status 404 returned error can't find the container with id 56ace16b923e3426799f454c694129708ce25f84607d1c2643c6856557058d50 Apr 19 12:42:28.192347 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:28.192310 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-699df8bd8f-7wtll" event={"ID":"c0d0f8be-9607-444d-9aa2-a79ee5318c2a","Type":"ContainerStarted","Data":"56ace16b923e3426799f454c694129708ce25f84607d1c2643c6856557058d50"} Apr 19 12:42:29.197150 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:29.197118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-699df8bd8f-7wtll" event={"ID":"c0d0f8be-9607-444d-9aa2-a79ee5318c2a","Type":"ContainerStarted","Data":"5576e8c695bef7af743b5116bb94b2f433767f2f44be99ea013ffc5d95068faa"} Apr 19 12:42:29.197505 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:29.197249 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:29.215775 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:29.215732 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-699df8bd8f-7wtll" podStartSLOduration=0.885802465 podStartE2EDuration="2.215720005s" podCreationTimestamp="2026-04-19 12:42:27 +0000 UTC" 
firstStartedPulling="2026-04-19 12:42:27.556537739 +0000 UTC m=+711.074695333" lastFinishedPulling="2026-04-19 12:42:28.886455276 +0000 UTC m=+712.404612873" observedRunningTime="2026-04-19 12:42:29.21362525 +0000 UTC m=+712.731782862" watchObservedRunningTime="2026-04-19 12:42:29.215720005 +0000 UTC m=+712.733877646" Apr 19 12:42:35.206216 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.206188 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-699df8bd8f-7wtll" Apr 19 12:42:35.245657 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.245629 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-859db5f6bd-q9fnw"] Apr 19 12:42:35.245940 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.245905 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-859db5f6bd-q9fnw" podUID="a57969b3-66f7-47eb-a4e0-ad7425e59979" containerName="maas-api" containerID="cri-o://8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a" gracePeriod=30 Apr 19 12:42:35.490596 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.490573 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-859db5f6bd-q9fnw" Apr 19 12:42:35.587037 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.587004 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a57969b3-66f7-47eb-a4e0-ad7425e59979-maas-api-tls\") pod \"a57969b3-66f7-47eb-a4e0-ad7425e59979\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " Apr 19 12:42:35.587207 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.587079 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbmd9\" (UniqueName: \"kubernetes.io/projected/a57969b3-66f7-47eb-a4e0-ad7425e59979-kube-api-access-vbmd9\") pod \"a57969b3-66f7-47eb-a4e0-ad7425e59979\" (UID: \"a57969b3-66f7-47eb-a4e0-ad7425e59979\") " Apr 19 12:42:35.589147 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.589119 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57969b3-66f7-47eb-a4e0-ad7425e59979-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "a57969b3-66f7-47eb-a4e0-ad7425e59979" (UID: "a57969b3-66f7-47eb-a4e0-ad7425e59979"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:42:35.589244 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.589224 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57969b3-66f7-47eb-a4e0-ad7425e59979-kube-api-access-vbmd9" (OuterVolumeSpecName: "kube-api-access-vbmd9") pod "a57969b3-66f7-47eb-a4e0-ad7425e59979" (UID: "a57969b3-66f7-47eb-a4e0-ad7425e59979"). InnerVolumeSpecName "kube-api-access-vbmd9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:42:35.687647 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.687609 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vbmd9\" (UniqueName: \"kubernetes.io/projected/a57969b3-66f7-47eb-a4e0-ad7425e59979-kube-api-access-vbmd9\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:42:35.687647 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:35.687647 2578 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a57969b3-66f7-47eb-a4e0-ad7425e59979-maas-api-tls\") on node \"ip-10-0-129-233.ec2.internal\" DevicePath \"\"" Apr 19 12:42:36.221972 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.221940 2578 generic.go:358] "Generic (PLEG): container finished" podID="a57969b3-66f7-47eb-a4e0-ad7425e59979" containerID="8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a" exitCode=0 Apr 19 12:42:36.221972 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.221974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-859db5f6bd-q9fnw" event={"ID":"a57969b3-66f7-47eb-a4e0-ad7425e59979","Type":"ContainerDied","Data":"8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a"} Apr 19 12:42:36.222472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.221996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-859db5f6bd-q9fnw" event={"ID":"a57969b3-66f7-47eb-a4e0-ad7425e59979","Type":"ContainerDied","Data":"b8b89e9152b0fd19a05f05d3e1b7de6ae8893601b12532493183442c725dd2c1"} Apr 19 12:42:36.222472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.222010 2578 scope.go:117] "RemoveContainer" containerID="8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a" Apr 19 12:42:36.222472 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.222011 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-859db5f6bd-q9fnw"
Apr 19 12:42:36.230113 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.230091 2578 scope.go:117] "RemoveContainer" containerID="8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a"
Apr 19 12:42:36.230330 ip-10-0-129-233 kubenswrapper[2578]: E0419 12:42:36.230312 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a\": container with ID starting with 8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a not found: ID does not exist" containerID="8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a"
Apr 19 12:42:36.230387 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.230342 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a"} err="failed to get container status \"8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a\": rpc error: code = NotFound desc = could not find container \"8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a\": container with ID starting with 8da9db3bf234ae4642bd4543e0deb4f45fe0cd34bd9fe7a32541c8f274ab948a not found: ID does not exist"
Apr 19 12:42:36.242384 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.242360 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-859db5f6bd-q9fnw"]
Apr 19 12:42:36.246469 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:36.246447 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-859db5f6bd-q9fnw"]
Apr 19 12:42:37.064912 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:37.064875 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57969b3-66f7-47eb-a4e0-ad7425e59979" path="/var/lib/kubelet/pods/a57969b3-66f7-47eb-a4e0-ad7425e59979/volumes"
Apr 19 12:42:38.821325 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.821294 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"]
Apr 19 12:42:38.821697 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.821638 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a57969b3-66f7-47eb-a4e0-ad7425e59979" containerName="maas-api"
Apr 19 12:42:38.821697 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.821651 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57969b3-66f7-47eb-a4e0-ad7425e59979" containerName="maas-api"
Apr 19 12:42:38.821771 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.821725 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a57969b3-66f7-47eb-a4e0-ad7425e59979" containerName="maas-api"
Apr 19 12:42:38.826524 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.826504 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:38.828948 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.828927 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 19 12:42:38.829878 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.829832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 19 12:42:38.829878 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.829832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-v7ph6\""
Apr 19 12:42:38.829878 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.829875 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 19 12:42:38.834180 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.834155 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"]
Apr 19 12:42:38.906617 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.906587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:38.906742 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.906629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/889cb5fd-5ac2-4c4c-981b-ff9960735918-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:38.906742 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.906647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:38.906742 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.906719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:38.906934 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.906742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rl6\" (UniqueName: \"kubernetes.io/projected/889cb5fd-5ac2-4c4c-981b-ff9960735918-kube-api-access-54rl6\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:38.906934 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:38.906763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.007755 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.007724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.007886 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.007767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54rl6\" (UniqueName: \"kubernetes.io/projected/889cb5fd-5ac2-4c4c-981b-ff9960735918-kube-api-access-54rl6\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.007886 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.007803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.008006 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.007903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.008071 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.008044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/889cb5fd-5ac2-4c4c-981b-ff9960735918-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.008141 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.008071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.008203 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.008174 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.008329 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.008308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.008865 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.008840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.010347 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.010326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/889cb5fd-5ac2-4c4c-981b-ff9960735918-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.010587 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.010570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/889cb5fd-5ac2-4c4c-981b-ff9960735918-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.014908 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.014887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rl6\" (UniqueName: \"kubernetes.io/projected/889cb5fd-5ac2-4c4c-981b-ff9960735918-kube-api-access-54rl6\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gtk4z\" (UID: \"889cb5fd-5ac2-4c4c-981b-ff9960735918\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.137043 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.136981 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:39.260701 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:39.260648 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"]
Apr 19 12:42:39.262926 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:42:39.262900 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889cb5fd_5ac2_4c4c_981b_ff9960735918.slice/crio-99414d95f54ed5ba3caa5c7ad2f6dec4745b8c6dd16db9d92a0d9c458293ec0e WatchSource:0}: Error finding container 99414d95f54ed5ba3caa5c7ad2f6dec4745b8c6dd16db9d92a0d9c458293ec0e: Status 404 returned error can't find the container with id 99414d95f54ed5ba3caa5c7ad2f6dec4745b8c6dd16db9d92a0d9c458293ec0e
Apr 19 12:42:40.236940 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:40.236903 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z" event={"ID":"889cb5fd-5ac2-4c4c-981b-ff9960735918","Type":"ContainerStarted","Data":"99414d95f54ed5ba3caa5c7ad2f6dec4745b8c6dd16db9d92a0d9c458293ec0e"}
Apr 19 12:42:46.260352 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:46.260318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z" event={"ID":"889cb5fd-5ac2-4c4c-981b-ff9960735918","Type":"ContainerStarted","Data":"7e657081c906131da109d76677945d1f2a45f03967a05aad365c62840f907278"}
Apr 19 12:42:49.000385 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.000341 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"]
Apr 19 12:42:49.004078 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.004054 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.006401 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.006379 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 19 12:42:49.013600 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.013580 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"]
Apr 19 12:42:49.093890 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.093861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.093890 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.093900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.094069 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.093952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jd9v\" (UniqueName: \"kubernetes.io/projected/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-kube-api-access-2jd9v\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.094069 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.093983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.094069 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.094037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.094171 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.094093 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195457 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195457 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195707 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195707 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jd9v\" (UniqueName: \"kubernetes.io/projected/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-kube-api-access-2jd9v\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195804 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195804 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.195983 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.195962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.196039 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.196015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.196178 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.196136 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.198366 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.198347 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.198560 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.198530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.207434 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.207410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jd9v\" (UniqueName: \"kubernetes.io/projected/14f7b98b-4373-4ec0-8a85-675fd3eb7f8e-kube-api-access-2jd9v\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t\" (UID: \"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.314584 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.314501 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:49.444966 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:49.444936 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"]
Apr 19 12:42:49.447516 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:42:49.447462 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14f7b98b_4373_4ec0_8a85_675fd3eb7f8e.slice/crio-09e8408a9701e4a0e191cf4cf467a3864a6f1b541c96647446d35f2d816b82ff WatchSource:0}: Error finding container 09e8408a9701e4a0e191cf4cf467a3864a6f1b541c96647446d35f2d816b82ff: Status 404 returned error can't find the container with id 09e8408a9701e4a0e191cf4cf467a3864a6f1b541c96647446d35f2d816b82ff
Apr 19 12:42:50.276889 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:50.276852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t" event={"ID":"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e","Type":"ContainerStarted","Data":"6829590a9c0e599a77df96fe09a8d6c6a672335495aa1189f2ca017df0da9727"}
Apr 19 12:42:50.276889 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:50.276889 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t" event={"ID":"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e","Type":"ContainerStarted","Data":"09e8408a9701e4a0e191cf4cf467a3864a6f1b541c96647446d35f2d816b82ff"}
Apr 19 12:42:51.281902 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:51.281873 2578 generic.go:358] "Generic (PLEG): container finished" podID="889cb5fd-5ac2-4c4c-981b-ff9960735918" containerID="7e657081c906131da109d76677945d1f2a45f03967a05aad365c62840f907278" exitCode=0
Apr 19 12:42:51.282352 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:51.281956 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z" event={"ID":"889cb5fd-5ac2-4c4c-981b-ff9960735918","Type":"ContainerDied","Data":"7e657081c906131da109d76677945d1f2a45f03967a05aad365c62840f907278"}
Apr 19 12:42:53.301158 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:53.301123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z" event={"ID":"889cb5fd-5ac2-4c4c-981b-ff9960735918","Type":"ContainerStarted","Data":"82afe96a6fdfbf7374f42bfcddd92c2e46d5d2859df2f70075b4822bb8fafe3a"}
Apr 19 12:42:53.301535 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:53.301345 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:42:53.318464 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:53.318380 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z" podStartSLOduration=2.11476326 podStartE2EDuration="15.318366199s" podCreationTimestamp="2026-04-19 12:42:38 +0000 UTC" firstStartedPulling="2026-04-19 12:42:39.266383992 +0000 UTC m=+722.784541586" lastFinishedPulling="2026-04-19 12:42:52.469986932 +0000 UTC m=+735.988144525" observedRunningTime="2026-04-19 12:42:53.317618143 +0000 UTC m=+736.835775790" watchObservedRunningTime="2026-04-19 12:42:53.318366199 +0000 UTC m=+736.836523811"
Apr 19 12:42:55.309115 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:55.309034 2578 generic.go:358] "Generic (PLEG): container finished" podID="14f7b98b-4373-4ec0-8a85-675fd3eb7f8e" containerID="6829590a9c0e599a77df96fe09a8d6c6a672335495aa1189f2ca017df0da9727" exitCode=0
Apr 19 12:42:55.309435 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:55.309106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t" event={"ID":"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e","Type":"ContainerDied","Data":"6829590a9c0e599a77df96fe09a8d6c6a672335495aa1189f2ca017df0da9727"}
Apr 19 12:42:56.314719 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:56.314686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t" event={"ID":"14f7b98b-4373-4ec0-8a85-675fd3eb7f8e","Type":"ContainerStarted","Data":"db227fa13bd62d2b9ce824da254adfdadffcdee7bd78186e5e778692dcaf213d"}
Apr 19 12:42:56.315109 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:56.314905 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:42:56.332140 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:42:56.332067 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t" podStartSLOduration=8.14105209 podStartE2EDuration="8.332052353s" podCreationTimestamp="2026-04-19 12:42:48 +0000 UTC" firstStartedPulling="2026-04-19 12:42:55.30975964 +0000 UTC m=+738.827917231" lastFinishedPulling="2026-04-19 12:42:55.500759902 +0000 UTC m=+739.018917494" observedRunningTime="2026-04-19 12:42:56.331190238 +0000 UTC m=+739.849347850" watchObservedRunningTime="2026-04-19 12:42:56.332052353 +0000 UTC m=+739.850209967"
Apr 19 12:43:04.316887 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:04.316858 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gtk4z"
Apr 19 12:43:07.330622 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:07.330589 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t"
Apr 19 12:43:13.609921 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.609885 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"]
Apr 19 12:43:13.643505 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.643466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"]
Apr 19 12:43:13.643651 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.643593 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.646138 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.646113 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 19 12:43:13.798766 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.798735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.798918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.798790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.798918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.798835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqkw\" (UniqueName: \"kubernetes.io/projected/fe26e13e-a5d6-4eec-8240-045cfc6607d1-kube-api-access-6hqkw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.798918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.798861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.798918 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.798900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.799082 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.798921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe26e13e-a5d6-4eec-8240-045cfc6607d1-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900170 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900170 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe26e13e-a5d6-4eec-8240-045cfc6607d1-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900355 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900355 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900355 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqkw\" (UniqueName: \"kubernetes.io/projected/fe26e13e-a5d6-4eec-8240-045cfc6607d1-kube-api-access-6hqkw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900355 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900627 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900771 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.900808 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.900755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.902535 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.902516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe26e13e-a5d6-4eec-8240-045cfc6607d1-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.902780 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.902762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe26e13e-a5d6-4eec-8240-045cfc6607d1-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.907908 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.907885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqkw\" (UniqueName: \"kubernetes.io/projected/fe26e13e-a5d6-4eec-8240-045cfc6607d1-kube-api-access-6hqkw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4\" (UID: \"fe26e13e-a5d6-4eec-8240-045cfc6607d1\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:13.953060 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:13.953038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
Apr 19 12:43:14.285238 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:14.285211 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"]
Apr 19 12:43:14.287879 ip-10-0-129-233 kubenswrapper[2578]: W0419 12:43:14.287846 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe26e13e_a5d6_4eec_8240_045cfc6607d1.slice/crio-9b7c4a822f46739fc05b72a4abca2198562f1db6c9219afd40232aec2866096e WatchSource:0}: Error finding container 9b7c4a822f46739fc05b72a4abca2198562f1db6c9219afd40232aec2866096e: Status 404 returned error can't find the container with id 9b7c4a822f46739fc05b72a4abca2198562f1db6c9219afd40232aec2866096e
Apr 19 12:43:14.289717 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:14.289701 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:43:14.371396 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:14.371355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4" event={"ID":"fe26e13e-a5d6-4eec-8240-045cfc6607d1","Type":"ContainerStarted","Data":"6f6df21863dade5b7a0fe5b3d868a61abea5efc8f91f8b51e84499c121530cbe"}
Apr 19 12:43:14.371537 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:14.371403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4"
event={"ID":"fe26e13e-a5d6-4eec-8240-045cfc6607d1","Type":"ContainerStarted","Data":"9b7c4a822f46739fc05b72a4abca2198562f1db6c9219afd40232aec2866096e"} Apr 19 12:43:20.394306 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:20.394276 2578 generic.go:358] "Generic (PLEG): container finished" podID="fe26e13e-a5d6-4eec-8240-045cfc6607d1" containerID="6f6df21863dade5b7a0fe5b3d868a61abea5efc8f91f8b51e84499c121530cbe" exitCode=0 Apr 19 12:43:20.394701 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:20.394340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4" event={"ID":"fe26e13e-a5d6-4eec-8240-045cfc6607d1","Type":"ContainerDied","Data":"6f6df21863dade5b7a0fe5b3d868a61abea5efc8f91f8b51e84499c121530cbe"} Apr 19 12:43:21.399714 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:21.399669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4" event={"ID":"fe26e13e-a5d6-4eec-8240-045cfc6607d1","Type":"ContainerStarted","Data":"b4afeb5a1da01ae5253153f9150c9b2790360547cd59d1692e0e8ddf7c7cd742"} Apr 19 12:43:21.400097 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:21.399958 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4" Apr 19 12:43:21.417629 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:21.417573 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4" podStartSLOduration=8.24573252 podStartE2EDuration="8.417559785s" podCreationTimestamp="2026-04-19 12:43:13 +0000 UTC" firstStartedPulling="2026-04-19 12:43:20.394937228 +0000 UTC m=+763.913094819" lastFinishedPulling="2026-04-19 12:43:20.566764479 +0000 UTC m=+764.084922084" observedRunningTime="2026-04-19 12:43:21.416547032 +0000 UTC m=+764.934704888" watchObservedRunningTime="2026-04-19 12:43:21.417559785 +0000 UTC m=+764.935717445" Apr 19 
12:43:32.416882 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:43:32.416849 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4" Apr 19 12:45:37.403021 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:45:37.402933 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:45:37.403432 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:45:37.403223 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:50:37.424655 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:50:37.424624 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:50:37.425660 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:50:37.425628 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:55:37.446389 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:55:37.446359 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 12:55:37.449047 ip-10-0-129-233 kubenswrapper[2578]: I0419 12:55:37.448665 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 13:00:37.468624 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:00:37.468517 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 13:00:37.474580 ip-10-0-129-233 
kubenswrapper[2578]: I0419 13:00:37.471865 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 13:00:41.681570 ip-10-0-129-233 kubenswrapper[2578]: E0419 13:00:41.681539 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Apr 19 13:05:37.491419 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:05:37.491315 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 13:05:37.495441 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:05:37.495424 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 13:06:34.714392 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:34.714313 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7j52c_3792cd68-3123-4e65-bfd6-a57c6528d028/manager/0.log" Apr 19 13:06:34.820504 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:34.820435 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-699df8bd8f-7wtll_c0d0f8be-9607-444d-9aa2-a79ee5318c2a/maas-api/0.log" Apr 19 13:06:35.053314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:35.053221 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-2j694_e46ceb33-f1df-4349-8d92-6c1636f2fb98/manager/1.log" Apr 19 13:06:35.160975 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:35.160951 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9ff869b6b-c4c6d_ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a/manager/0.log" Apr 19 13:06:35.471533 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:35.471509 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-8zr62_43aeb4ec-2438-4029-98ad-416497c39e00/postgres/0.log" Apr 19 13:06:37.500268 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:37.500240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-kj6nf_70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4/manager/0.log" Apr 19 13:06:37.945336 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:37.945308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8772d_94ddfebb-3e22-4687-ac70-95168adb6c6c/discovery/0.log" Apr 19 13:06:38.151678 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:38.151638 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-874cdfcc7-g9xcq_c1bac5ee-efd8-4476-900d-964769a87ad2/kube-auth-proxy/0.log" Apr 19 13:06:38.792879 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:38.792834 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4_fe26e13e-a5d6-4eec-8240-045cfc6607d1/storage-initializer/0.log" Apr 19 13:06:38.799314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:38.799292 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-th4j4_fe26e13e-a5d6-4eec-8240-045cfc6607d1/main/0.log" Apr 19 13:06:38.903422 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:38.903386 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-gtk4z_889cb5fd-5ac2-4c4c-981b-ff9960735918/storage-initializer/0.log" Apr 19 13:06:38.910211 ip-10-0-129-233 kubenswrapper[2578]: I0419 
13:06:38.910188 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-gtk4z_889cb5fd-5ac2-4c4c-981b-ff9960735918/main/0.log" Apr 19 13:06:39.129395 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:39.129309 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t_14f7b98b-4373-4ec0-8a85-675fd3eb7f8e/storage-initializer/0.log" Apr 19 13:06:39.135788 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:39.135768 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-q8q9t_14f7b98b-4373-4ec0-8a85-675fd3eb7f8e/main/0.log" Apr 19 13:06:45.892734 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:45.892708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bjv2v_635b706c-4824-43f2-9e8f-fed36e897e9b/global-pull-secret-syncer/0.log" Apr 19 13:06:46.046080 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:46.046051 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q6bkc_ea2e728c-ff62-44d4-999e-021181a80e96/konnectivity-agent/0.log" Apr 19 13:06:46.090689 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:46.090663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-233.ec2.internal_b2951febf619ea3eaaaa44d21e7bf15f/haproxy/0.log" Apr 19 13:06:50.600039 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:50.600008 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-kj6nf_70dd7ed8-aaf8-42f5-9d0e-9f87d1c245a4/manager/0.log" Apr 19 13:06:52.254355 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:52.254330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6754c8fdfb-2h4xs_873ac349-1422-440d-a10c-599af83ba311/metrics-server/0.log" Apr 19 
13:06:52.465571 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:52.465545 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zn87z_250f49a4-ff50-4180-85c9-c0a23c798518/node-exporter/0.log" Apr 19 13:06:52.489786 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:52.489756 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zn87z_250f49a4-ff50-4180-85c9-c0a23c798518/kube-rbac-proxy/0.log" Apr 19 13:06:52.510321 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:52.510261 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zn87z_250f49a4-ff50-4180-85c9-c0a23c798518/init-textfile/0.log" Apr 19 13:06:54.734545 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.734509 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24"] Apr 19 13:06:54.738046 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.738025 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.740258 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.740230 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5849\"/\"openshift-service-ca.crt\"" Apr 19 13:06:54.740371 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.740279 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5849\"/\"kube-root-ca.crt\"" Apr 19 13:06:54.741448 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.741419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f5849\"/\"default-dockercfg-fxnnq\"" Apr 19 13:06:54.746377 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.746355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24"] Apr 19 13:06:54.856591 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.856564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-lib-modules\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.856726 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.856606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-proc\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.856726 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.856640 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-podres\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.856726 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.856657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-626jl\" (UniqueName: \"kubernetes.io/projected/a235f3f5-5178-4641-b81d-8b308a0550e5-kube-api-access-626jl\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.856726 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.856686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-sys\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957191 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-proc\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-podres\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " 
pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-626jl\" (UniqueName: \"kubernetes.io/projected/a235f3f5-5178-4641-b81d-8b308a0550e5-kube-api-access-626jl\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-sys\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-proc\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957314 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-lib-modules\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957472 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-podres\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957472 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-sys\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.957472 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.957408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a235f3f5-5178-4641-b81d-8b308a0550e5-lib-modules\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:54.964572 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:54.964551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-626jl\" (UniqueName: \"kubernetes.io/projected/a235f3f5-5178-4641-b81d-8b308a0550e5-kube-api-access-626jl\") pod \"perf-node-gather-daemonset-7vj24\" (UID: \"a235f3f5-5178-4641-b81d-8b308a0550e5\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:55.048319 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:55.048268 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:55.177767 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:55.177705 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24"] Apr 19 13:06:55.180039 ip-10-0-129-233 kubenswrapper[2578]: W0419 13:06:55.180009 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda235f3f5_5178_4641_b81d_8b308a0550e5.slice/crio-89b720e15390a5e1c94613d4d3c74a5160088eee89c66b6d1b3adaf3009ca45e WatchSource:0}: Error finding container 89b720e15390a5e1c94613d4d3c74a5160088eee89c66b6d1b3adaf3009ca45e: Status 404 returned error can't find the container with id 89b720e15390a5e1c94613d4d3c74a5160088eee89c66b6d1b3adaf3009ca45e Apr 19 13:06:55.181836 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:55.181814 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 13:06:55.972908 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:55.972869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" event={"ID":"a235f3f5-5178-4641-b81d-8b308a0550e5","Type":"ContainerStarted","Data":"7d0c9c73b2201f381d2ed589a1a58b21ec03bc7b3ecd1b5d6a8088a7b3ad8448"} Apr 19 13:06:55.972908 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:55.972904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" event={"ID":"a235f3f5-5178-4641-b81d-8b308a0550e5","Type":"ContainerStarted","Data":"89b720e15390a5e1c94613d4d3c74a5160088eee89c66b6d1b3adaf3009ca45e"} Apr 19 13:06:55.973386 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:55.972930 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:06:55.991104 ip-10-0-129-233 
kubenswrapper[2578]: I0419 13:06:55.991058 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" podStartSLOduration=1.9910462020000002 podStartE2EDuration="1.991046202s" podCreationTimestamp="2026-04-19 13:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 13:06:55.990558153 +0000 UTC m=+2179.508715796" watchObservedRunningTime="2026-04-19 13:06:55.991046202 +0000 UTC m=+2179.509203815" Apr 19 13:06:56.477366 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:56.477344 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wslbj_05871dc3-ae4d-416d-b447-072b85515564/dns/0.log" Apr 19 13:06:56.497061 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:56.497039 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wslbj_05871dc3-ae4d-416d-b447-072b85515564/kube-rbac-proxy/0.log" Apr 19 13:06:56.614303 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:56.614276 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wpnz9_f56e0b8d-b9f5-437d-95c9-cd46b8dbcea0/dns-node-resolver/0.log" Apr 19 13:06:57.031675 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:57.031630 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-76856dbc87-mlxnt_1f2405a0-05f1-4033-8bc1-a53d6643aa6c/registry/0.log" Apr 19 13:06:57.050296 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:57.050254 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fqzjh_5511d94c-29bb-45a0-b060-745261d9a2e8/node-ca/0.log" Apr 19 13:06:57.949054 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:57.949025 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8772d_94ddfebb-3e22-4687-ac70-95168adb6c6c/discovery/0.log" Apr 19 13:06:57.995072 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:57.995045 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-874cdfcc7-g9xcq_c1bac5ee-efd8-4476-900d-964769a87ad2/kube-auth-proxy/0.log" Apr 19 13:06:58.578177 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:58.578140 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r258l_126af27c-12ce-43aa-a67f-805e0e4b3a5a/serve-healthcheck-canary/0.log" Apr 19 13:06:59.034059 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:59.034027 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9phw_8083e678-d295-4963-917f-b040594707dd/kube-rbac-proxy/0.log" Apr 19 13:06:59.053981 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:59.053952 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9phw_8083e678-d295-4963-917f-b040594707dd/exporter/0.log" Apr 19 13:06:59.073817 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:06:59.073776 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9phw_8083e678-d295-4963-917f-b040594707dd/extractor/0.log" Apr 19 13:07:01.006418 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.006386 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7j52c_3792cd68-3123-4e65-bfd6-a57c6528d028/manager/0.log" Apr 19 13:07:01.035847 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.035824 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-699df8bd8f-7wtll_c0d0f8be-9607-444d-9aa2-a79ee5318c2a/maas-api/0.log" Apr 19 13:07:01.106833 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.106809 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-2j694_e46ceb33-f1df-4349-8d92-6c1636f2fb98/manager/0.log" Apr 19 13:07:01.117017 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.116996 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-2j694_e46ceb33-f1df-4349-8d92-6c1636f2fb98/manager/1.log" Apr 19 13:07:01.141126 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.141106 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9ff869b6b-c4c6d_ea7e6b0e-1e17-4724-8d3b-6847c1c98f5a/manager/0.log" Apr 19 13:07:01.231792 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.231775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-8zr62_43aeb4ec-2438-4029-98ad-416497c39e00/postgres/0.log" Apr 19 13:07:01.985970 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:01.985939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-7vj24" Apr 19 13:07:08.009740 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.009710 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rzvv_c762d1dd-0bb7-4a1e-83a7-5a20dfd1674f/kube-multus/0.log" Apr 19 13:07:08.246545 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.246512 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/kube-multus-additional-cni-plugins/0.log" Apr 19 13:07:08.285127 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.285063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/egress-router-binary-copy/0.log" Apr 19 13:07:08.324149 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.324125 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/cni-plugins/0.log" Apr 19 13:07:08.366588 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.366562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/bond-cni-plugin/0.log" Apr 19 13:07:08.405020 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.404998 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/routeoverride-cni/0.log" Apr 19 13:07:08.444885 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.444862 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/whereabouts-cni-bincopy/0.log" Apr 19 13:07:08.485110 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.485087 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mdpcd_59d9fb8b-c8ff-4890-ae51-0f7fa04e6865/whereabouts-cni/0.log" Apr 19 13:07:08.904613 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.904585 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7t9j2_46c7636d-9cd5-47c0-afaa-e58b27072e37/network-metrics-daemon/0.log" Apr 19 13:07:08.940410 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:08.940370 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7t9j2_46c7636d-9cd5-47c0-afaa-e58b27072e37/kube-rbac-proxy/0.log" Apr 19 13:07:09.882566 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.882540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-controller/0.log" Apr 19 13:07:09.901771 
ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.901734 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/0.log" Apr 19 13:07:09.911707 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.911683 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovn-acl-logging/1.log" Apr 19 13:07:09.929897 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.929876 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/kube-rbac-proxy-node/0.log" Apr 19 13:07:09.949607 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.949585 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/kube-rbac-proxy-ovn-metrics/0.log" Apr 19 13:07:09.967092 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.967068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/northd/0.log" Apr 19 13:07:09.987268 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:09.987249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/nbdb/0.log" Apr 19 13:07:10.014764 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:10.014718 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/sbdb/0.log" Apr 19 13:07:10.111044 ip-10-0-129-233 kubenswrapper[2578]: I0419 13:07:10.111010 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jrwbt_dc60d29d-7874-4905-9075-ae159b1131a3/ovnkube-controller/0.log"