Apr 17 11:16:11.344695 ip-10-0-128-205 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:11.777212 ip-10-0-128-205 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:11.777212 ip-10-0-128-205 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:11.777212 ip-10-0-128-205 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:11.777212 ip-10-0-128-205 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:11.777212 ip-10-0-128-205 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:11.779356 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.779254    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:11.782803 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782784    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.782803 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782802    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.782803 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782807    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782812    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782816    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782820    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782824    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782828    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782832    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782835    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782839    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782844    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782848    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782852    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782855    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782859    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782863    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782867    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782870    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782874    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782878    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782882    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.782992 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782886    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782890    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782894    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782899    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782903    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782907    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782913    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782919    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782925    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782930    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782934    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782938    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782942    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782947    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782952    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782957    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782961    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782965    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782970    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782975    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.783808 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782979    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782983    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782987    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782991    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.782995    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783001    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783005    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783009    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783013    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783017    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783023    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783029    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783034    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783039    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783044    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783048    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783052    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783057    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783061    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783066    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.784557 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783071    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783075    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783079    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783084    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783089    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783093    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783099    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783104    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783108    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783112    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783116    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783120    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783124    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783131    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783135    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783140    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783144    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783148    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783152    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783156    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.785055 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783160    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783164    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783169    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783173    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783851    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783861    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783866    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783871    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783875    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783880    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783885    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783891    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783895    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783901    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783905    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783910    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783915    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783920    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783925    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783930    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.785553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783934    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783938    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783942    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783946    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783950    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783955    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783958    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783963    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783967    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783971    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783975    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783979    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783983    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783987    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783991    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783995    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.783999    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784003    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784008    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784012    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.786049 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784016    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784023    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784027    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784031    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784035    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784039    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784044    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784048    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784052    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784058    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784063    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784067    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784071    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784075    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784079    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784083    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784087    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784091    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784096    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.787030 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784100    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784104    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784108    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784112    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784116    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784121    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784125    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784129    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784133    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784137    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784141    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784145    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784149    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784153    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784157    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784161    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784165    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784169    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784174    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784178    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.787695 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784186    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784191    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784196    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784203    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784209    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784213    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784217    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784221    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784226    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784230    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.784234    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786004    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786020    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786032    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786038    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786045    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786050    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786057    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786069    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786074    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786083    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:11.788262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786089    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786095    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786100    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786105    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786109    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786114    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786119    2577 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786124    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786129    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786135    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786140    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786145    2577 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786150    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786156    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786163    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786169    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786174    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786180    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786185    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786190    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786194    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786200    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786205    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786211    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786216    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:11.788962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786221    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786225    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786230    2577 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786235    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786243    2577 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786248    2577 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786254    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786259    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786264    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786271    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786277    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786281    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786286    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786291    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786296    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786300    2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786304    2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786309    2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786314    2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786319    2577 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786325    2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786330    2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786335    2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786341    2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786346    2577
flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:11.789661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786351 2577 flags.go:64] FLAG: --help="false" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786355 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-128-205.ec2.internal" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786377 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786382 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786387 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786393 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786398 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786403 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786408 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786412 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786416 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786421 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786427 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:11.790387 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786433 2577 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786437 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786442 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786447 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786452 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786456 2577 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786461 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786465 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786470 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786480 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:11.790387 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786485 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786489 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786494 2577 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786500 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786506 2577 flags.go:64] 
FLAG: --make-iptables-util-chains="true" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786510 2577 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786515 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786522 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786527 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786534 2577 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786539 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786543 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786548 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786554 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786558 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786563 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786568 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786579 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786584 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 
11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786589 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786593 2577 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786598 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786607 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786614 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:11.790953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786619 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786624 2577 flags.go:64] FLAG: --port="10250" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786629 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786633 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09d2f1f743e949497" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786638 2577 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786643 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786648 2577 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786653 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786657 2577 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:11.791559 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:16:11.786663 2577 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786668 2577 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786677 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786682 2577 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786688 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786693 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786698 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786703 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786708 2577 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786713 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786718 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786723 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786727 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786732 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786737 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 
11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786742 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786747 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:11.791559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786751 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786755 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786760 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786765 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786771 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786777 2577 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786782 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786790 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786794 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786799 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786805 2577 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786810 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786815 2577 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786819 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786824 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786829 2577 flags.go:64] FLAG: --v="2" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786835 2577 flags.go:64] FLAG: --version="false" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786843 2577 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786850 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.786855 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787010 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787018 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787023 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787027 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:11.792211 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787033 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787037 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 
11:16:11.787042 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787046 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787050 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787055 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787059 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787064 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787069 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787074 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787078 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787083 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787087 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787092 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787097 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:11.792798 ip-10-0-128-205 
kubenswrapper[2577]: W0417 11:16:11.787100 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787104 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787108 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787112 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787117 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:11.792798 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787121 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787125 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787130 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787133 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787137 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787141 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787145 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787149 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 
11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787153 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787157 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787161 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787166 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787170 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787176 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787180 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787184 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787188 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787192 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787196 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787210 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:11.793349 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787215 2577 feature_gate.go:328] unrecognized 
feature gate: GatewayAPI Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787219 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787223 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787229 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787235 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787240 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787245 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787250 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787254 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787259 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787263 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787268 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787274 2577 feature_gate.go:349] Setting deprecated feature gate 
KMSv1=true. It will be removed in a future release. Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787279 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787284 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787289 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787293 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787297 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:11.793864 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787301 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787306 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787309 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787314 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787318 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787322 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787327 2577 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787331 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787336 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787340 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787344 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787348 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787352 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787356 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787378 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787383 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787387 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787391 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:11.794304 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787395 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:11.794304 ip-10-0-128-205 
kubenswrapper[2577]: W0417 11:16:11.787399 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787404 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787408 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787412 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.787416 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.787886 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.794600 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.794615 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794661 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794666 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794669 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794672 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794675 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794678 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794681 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794683 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.794764 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794686 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794689 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794691 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794694 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794696 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794700 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794703 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794705 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794708 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794710 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794712 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794715 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794717 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794720 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794722 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794725 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794728 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794732 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794736 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.795192 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794739 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794742 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794745 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794748 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794751 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794754 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794757 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794759 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794762 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794765 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794767 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794770 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794773 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794775 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794778 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794781 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794784 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794786 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794789 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794792 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.795693 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794794 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794797 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794799 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794802 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794804 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794807 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794809 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794811 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794814 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794817 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794819 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794822 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794824 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794827 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794829 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794832 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794834 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794837 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794840 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794842 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.796184 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794845 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794847 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794849 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794852 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794854 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794857 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794859 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794861 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794864 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794868 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794872 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794875 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794880 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794883 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794885 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794887 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794890 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794892 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.796691 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794895 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.794900 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.794997 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795002 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795004 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795008 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795011 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795014 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795017 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795020 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795022 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795025 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795028 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795031 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795033 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795036 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.797113 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795038 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795041 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795043 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795045 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795048 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795050 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795052 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795055 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795057 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795059 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795062 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795065 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795067 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795069 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795072 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795074 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795076 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795079 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795081 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795083 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.797525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795086 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795088 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795090 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795093 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795095 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795098 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795100 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795103 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795105 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795108 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795110 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795112 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795115 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795117 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795120 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795122 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795124 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795127 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795129 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795132 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.798016 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795134 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795136 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795139 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795141 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795145 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795148 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795151 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795153 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795155 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795158 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795160 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795163 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795165 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795168 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795170 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795172 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795175 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795177 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795179 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.798553 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795182 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795185 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795187 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795189 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795192 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795194 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795197 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795208 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795213 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795217 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795219 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795222 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:11.795225 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.795230 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:11.799090 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.795909 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:11.799478 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.797864 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:11.799478 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.798776 2577 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:11.799478 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.798928 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:11.799845 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.799827 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:11.824165 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.824137 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:11.831451 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.831424 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:11.842651 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.842628 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:11.848864 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.848848 2577 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:11.850168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.850154 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:11.854725 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.854706 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:11.857440 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.857411 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e20b40f2-3e8c-4e97-9f45-c9f64f08450b:/dev/nvme0n1p3 f7a04997-d863-4241-83c1-789ca4b78329:/dev/nvme0n1p4]
Apr 17 11:16:11.857519 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.857438 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:11.863940 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.863821 2577 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:11.86170708 +0000 UTC m=+0.399629303 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3117359 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec298ca9c5cd3a77abbbcd2f90d10a0e SystemUUID:ec298ca9-c5cd-3a77-abbb-cd2f90d10a0e BootID:74a74b80-f50c-4d35-9eca-371dbac4556f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a7:94:f8:33:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a7:94:f8:33:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:b1:6b:05:c0:4e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:11.863940 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.863932 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:11.864060 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.864019 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:11.865062 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.865036 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:11.865228 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.865063 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-205.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:16:11.865279 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.865238 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:16:11.865279 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.865249 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:16:11.865279 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.865262 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:11.865967 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.865957 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:11.866744 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.866733 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:16:11.866848 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.866839 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:16:11.869684 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.869674 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:16:11.869727 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.869693 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:16:11.869727 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.869705 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:16:11.869809 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.869741 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:16:11.869809 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.869749 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:16:11.871048 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.871032 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:11.871133 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.871054 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:11.875480 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.875454 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 11:16:11.877386 ip-10-0-128-205
kubenswrapper[2577]: I0417 11:16:11.877344 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:11.879064 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879034 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:11.879064 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879063 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879071 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879077 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879083 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879089 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879095 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879101 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879108 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879115 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879126 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
11:16:11.879178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.879135 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:11.880626 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.880613 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:11.880626 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.880624 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:11.881852 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.881818 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:11.882151 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.881846 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:11.883405 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.883389 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-205.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:11.884438 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.884425 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:11.884518 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.884460 2577 server.go:1295] "Started kubelet" Apr 17 11:16:11.884584 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.884554 2577 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 17 11:16:11.884642 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.884591 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nkccm" Apr 17 11:16:11.884642 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.884561 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:11.884642 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.884629 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:11.885432 ip-10-0-128-205 systemd[1]: Started Kubernetes Kubelet. Apr 17 11:16:11.887226 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.886408 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:11.887544 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.887532 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:11.893760 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.893742 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nkccm" Apr 17 11:16:11.893988 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.891139 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-205.ec2.internal.18a720bc79727ec4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-205.ec2.internal,UID:ip-10-0-128-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-205.ec2.internal,},FirstTimestamp:2026-04-17 11:16:11.884437188 +0000 UTC m=+0.422359411,LastTimestamp:2026-04-17 
11:16:11.884437188 +0000 UTC m=+0.422359411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-205.ec2.internal,}" Apr 17 11:16:11.895525 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.895465 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:11.895963 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.895939 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:11.896591 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.896578 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:11.896903 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.896883 2577 factory.go:55] Registering systemd factory Apr 17 11:16:11.896986 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.896911 2577 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:11.897142 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.897073 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:11.897306 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897279 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:11.897306 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897302 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:11.897476 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897279 2577 factory.go:153] Registering CRI-O factory Apr 17 11:16:11.897476 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897335 2577 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:11.897476 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897408 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd 
client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:11.897476 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897433 2577 factory.go:103] Registering Raw factory Apr 17 11:16:11.897476 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897447 2577 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:11.897476 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897478 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:11.897745 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897487 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:11.897863 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.897844 2577 manager.go:319] Starting recovery of all containers Apr 17 11:16:11.899603 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.899584 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:11.900692 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.900667 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:11.902507 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.902484 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-205.ec2.internal\" not found" node="ip-10-0-128-205.ec2.internal" Apr 17 11:16:11.910868 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.910743 2577 manager.go:324] Recovery completed Apr 17 11:16:11.914836 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.914819 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.917380 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.917352 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.917448 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.917394 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.917448 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.917404 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.917902 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.917890 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:11.917934 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.917902 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:11.917934 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.917921 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:11.920053 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.920041 2577 policy_none.go:49] "None policy: Start" Apr 17 11:16:11.920104 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.920057 2577 
memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:11.920104 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.920067 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:11.967036 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967020 2577 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.967060 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967072 2577 server.go:85] "Starting device plugin registration server" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967340 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967350 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967488 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967562 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:11.967571 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.968056 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 11:16:11.972815 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:11.968089 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.059685 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.059606 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:12.061015 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.061000 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 11:16:12.061092 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.061029 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:12.061092 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.061047 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:12.061092 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.061055 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:12.061240 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.061097 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:12.064135 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.064114 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:12.067523 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.067511 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:12.068313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.068297 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:12.068411 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.068332 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:12.068411 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.068344 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:12.068411 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.068384 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.081172 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.081156 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.081246 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.081178 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-205.ec2.internal\": node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.104997 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.104981 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.161410 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.161357 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal"] Apr 17 11:16:12.161500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.161458 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:12.162795 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.162778 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:12.162872 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.162812 2577 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:12.162872 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.162824 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:12.164023 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.164011 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:12.164684 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.164666 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:12.164684 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.164688 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:12.164877 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.164703 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:12.164877 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.164816 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.164877 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.164859 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:12.165510 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.165496 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:12.165589 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.165525 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:12.165589 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.165541 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:12.165866 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.165852 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.165923 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.165879 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:12.166516 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.166502 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:12.166580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.166531 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:12.166580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.166544 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:12.190944 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.190920 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-205.ec2.internal\" not found" node="ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.195238 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.195223 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-205.ec2.internal\" not found" node="ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.199677 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.199651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8a71fa9642bbb7713db79711084fe6ff-config\") pod \"kube-apiserver-proxy-ip-10-0-128-205.ec2.internal\" (UID: \"8a71fa9642bbb7713db79711084fe6ff\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.199778 ip-10-0-128-205 kubenswrapper[2577]: I0417 
11:16:12.199683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3760fde5efd48d0c40ad74e563ff23b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal\" (UID: \"3760fde5efd48d0c40ad74e563ff23b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.199778 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.199703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3760fde5efd48d0c40ad74e563ff23b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal\" (UID: \"3760fde5efd48d0c40ad74e563ff23b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.205746 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.205730 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.299951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.299915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3760fde5efd48d0c40ad74e563ff23b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal\" (UID: \"3760fde5efd48d0c40ad74e563ff23b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.299951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.299952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8a71fa9642bbb7713db79711084fe6ff-config\") pod \"kube-apiserver-proxy-ip-10-0-128-205.ec2.internal\" (UID: \"8a71fa9642bbb7713db79711084fe6ff\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.300181 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.299971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3760fde5efd48d0c40ad74e563ff23b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal\" (UID: \"3760fde5efd48d0c40ad74e563ff23b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.300181 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.300028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8a71fa9642bbb7713db79711084fe6ff-config\") pod \"kube-apiserver-proxy-ip-10-0-128-205.ec2.internal\" (UID: \"8a71fa9642bbb7713db79711084fe6ff\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.300181 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.300024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3760fde5efd48d0c40ad74e563ff23b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal\" (UID: \"3760fde5efd48d0c40ad74e563ff23b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.300181 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.300102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3760fde5efd48d0c40ad74e563ff23b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal\" (UID: \"3760fde5efd48d0c40ad74e563ff23b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.306056 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.306036 2577 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.407147 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.407058 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.492241 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.492207 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.497921 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.497903 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" Apr 17 11:16:12.507615 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.507594 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.608168 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.608120 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.708698 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.708614 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.799114 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.799080 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 11:16:12.799631 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.799249 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:12.799631 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:16:12.799279 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:12.809397 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.809360 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.895580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.895535 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:11 +0000 UTC" deadline="2028-01-08 23:55:35.057001276 +0000 UTC" Apr 17 11:16:12.895580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.895576 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15156h39m22.161429516s" Apr 17 11:16:12.895580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.895580 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:12.908265 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.908239 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:12.909669 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:12.909650 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-205.ec2.internal\" not found" Apr 17 11:16:12.918168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.918136 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:12.935895 ip-10-0-128-205 kubenswrapper[2577]: I0417 
11:16:12.935865 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9l8xc"
Apr 17 11:16:12.945617 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.945590 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9l8xc"
Apr 17 11:16:12.996762 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:12.996686 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal"
Apr 17 11:16:13.008628 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.008553 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:13.013172 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.013148 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal"
Apr 17 11:16:13.021888 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.021871 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:13.025201 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:13.025165 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3760fde5efd48d0c40ad74e563ff23b8.slice/crio-e265255cc51e93c3a920cb97a1115455c4be16339eea46ae3333d3b44bd416ae WatchSource:0}: Error finding container e265255cc51e93c3a920cb97a1115455c4be16339eea46ae3333d3b44bd416ae: Status 404 returned error can't find the container with id e265255cc51e93c3a920cb97a1115455c4be16339eea46ae3333d3b44bd416ae
Apr 17 11:16:13.025551 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:13.025537 2577 manager.go:1169] Failed to process watch event
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a71fa9642bbb7713db79711084fe6ff.slice/crio-90542c54ee92cc433ec5139bdcd955c43e1eb73e363e54d88565695906ab32a0 WatchSource:0}: Error finding container 90542c54ee92cc433ec5139bdcd955c43e1eb73e363e54d88565695906ab32a0: Status 404 returned error can't find the container with id 90542c54ee92cc433ec5139bdcd955c43e1eb73e363e54d88565695906ab32a0
Apr 17 11:16:13.029432 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.029419 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:16:13.064498 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.064446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" event={"ID":"8a71fa9642bbb7713db79711084fe6ff","Type":"ContainerStarted","Data":"90542c54ee92cc433ec5139bdcd955c43e1eb73e363e54d88565695906ab32a0"}
Apr 17 11:16:13.065410 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.065386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" event={"ID":"3760fde5efd48d0c40ad74e563ff23b8","Type":"ContainerStarted","Data":"e265255cc51e93c3a920cb97a1115455c4be16339eea46ae3333d3b44bd416ae"}
Apr 17 11:16:13.228161 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.228128 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:13.619828 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.619796 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:13.871831 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.871754 2577 apiserver.go:52] "Watching apiserver"
Apr 17 11:16:13.876841 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.876816 2577 reflector.go:430] "Caches populated" type="*v1.Pod"
reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 11:16:13.877224 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.877200 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-r62zn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal","openshift-multus/network-metrics-daemon-dn4mx","openshift-network-operator/iptables-alerter-qbctw","openshift-dns/node-resolver-5rdwf","openshift-image-registry/node-ca-btbxk","openshift-multus/multus-additional-cni-plugins-ww4kd","openshift-multus/multus-wzjb7","openshift-network-diagnostics/network-check-target-pb4l4","openshift-ovn-kubernetes/ovnkube-node-hzwl5","kube-system/konnectivity-agent-pgpql","kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"]
Apr 17 11:16:13.880513 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.880492 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:13.882625 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.882602 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:13.882738 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:13.882679 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:13.884817 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.884794 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jm5kp\""
Apr 17 11:16:13.884917 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.884837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:13.885465 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.885443 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.885574 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.885548 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.885869 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.885849 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 11:16:13.886027 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.885853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 11:16:13.886455 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.886438 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 11:16:13.886912 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.886892 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-cq9wg\""
Apr 17 11:16:13.887049 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.887033 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/node-resolver-5rdwf"
Apr 17 11:16:13.887116 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.887060 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 11:16:13.887116 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.886902 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.887227 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.887063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.889837 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.889817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.890096 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.890083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-csxvt\""
Apr 17 11:16:13.890326 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.890314 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.891661 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.891640 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-btbxk"
Apr 17 11:16:13.891853 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.891836 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.893249 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.893234 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.893853 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.893835 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.894542 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.893882 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:16:13.894542 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.893925 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.894542 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.894149 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5fpwr\""
Apr 17 11:16:13.894542 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.894249 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.894542 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.894331 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.894786 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.894648 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vzb99\""
Apr 17 11:16:13.896133 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.896115 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:16:13.896210 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.896120 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f2dq9\""
Apr 17 11:16:13.896598 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.896579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:13.896686 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:13.896652 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:13.898973 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.898951 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.900697 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.900677 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.901027 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901008 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tn8wx\""
Apr 17 11:16:13.901156 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901142 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:13.901609 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901590 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 11:16:13.901609 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901597 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 11:16:13.901789 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 11:16:13.901862 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901820 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 11:16:13.901977 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.901872 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.903043 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.903026 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 11:16:13.903518 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.903500 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xdwq8\""
Apr 17 11:16:13.903621 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.903608 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 11:16:13.903842 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.903828 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:13.906081 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.905802 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:16:13.906081 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.905843 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.906081 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.905985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.906081 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.905985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9md9h\""
Apr 17 11:16:13.909381 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ww4kd\" (UID:
\"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:13.909477 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfb8\" (UniqueName: \"kubernetes.io/projected/43cab76d-7c1c-49f8-8a36-79896bc24bdc-kube-api-access-4rfb8\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:13.909477 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9xk\" (UniqueName: \"kubernetes.io/projected/0a040950-ccaf-4d81-8e53-7c50e5eca541-kube-api-access-7f9xk\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk"
Apr 17 11:16:13.909477 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mh4\" (UniqueName: \"kubernetes.io/projected/432289f7-2cea-4a47-8acc-2a378b04716a-kube-api-access-76mh4\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.909616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-run-netns\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.909616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-node-log\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.909616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-cni-netd\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.909616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvjk\" (UniqueName: \"kubernetes.io/projected/554f8dad-a601-4855-910a-f1e99d5cf979-kube-api-access-nfvjk\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.909749 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-var-lib-kubelet\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.909749 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909661 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6421a751-16fb-48d9-b12a-268fc1c823b1-iptables-alerter-script\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:13.909749
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-socket-dir-parent\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.909749 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:13.909886 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9lgh\" (UniqueName: \"kubernetes.io/projected/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-kube-api-access-v9lgh\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:13.909886 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-run\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.909886 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName:
\"kubernetes.io/configmap/432289f7-2cea-4a47-8acc-2a378b04716a-cni-binary-copy\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.909886 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-cni-bin\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.909886 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:13.910059 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.909905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-tuned\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.910380 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-cnibin\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.910448 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-hostroot\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.910448 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.910544 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910451 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90e677e5-76af-4e0b-b41e-8b6dc29ce009-tmp\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.910544 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n75d\" (UniqueName: \"kubernetes.io/projected/440a3dee-fa32-4dd9-8f44-d5532ff12996-kube-api-access-6n75d\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf"
Apr 17 11:16:13.910544 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-os-release\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.910681 ip-10-0-128-205 kubenswrapper[2577]: I0417
11:16:13.910553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-k8s-cni-cncf-io\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.910681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-cni-multus\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:13.910681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-systemd-units\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.910681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910627 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-system-cni-dir\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:13.910681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cni-binary-copy\") pod \"multus-additional-cni-plugins-ww4kd\" (UID:
\"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:13.910681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-lib-modules\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-host\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6421a751-16fb-48d9-b12a-268fc1c823b1-host-slash\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrrq9\" (UniqueName: \"kubernetes.io/projected/6421a751-16fb-48d9-b12a-268fc1c823b1-kube-api-access-wrrq9\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\"
(UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-slash\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-etc-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-kubernetes\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-env-overrides\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-systemd\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910901 2577
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-log-socket\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.910925 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a040950-ccaf-4d81-8e53-7c50e5eca541-serviceca\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910948 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysconfig\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.910971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysctl-d\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysctl-conf\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 
11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/432289f7-2cea-4a47-8acc-2a378b04716a-multus-daemon-config\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-modprobe-d\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-multus-certs\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-ovn\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cnibin\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: 
\"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-sys\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-system-cni-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/554f8dad-a601-4855-910a-f1e99d5cf979-ovn-node-metrics-cert\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-os-release\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxtgt\" (UniqueName: 
\"kubernetes.io/projected/90e677e5-76af-4e0b-b41e-8b6dc29ce009-kube-api-access-qxtgt\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-kubelet\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-etc-kubernetes\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911442 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-var-lib-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-ovnkube-script-lib\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a040950-ccaf-4d81-8e53-7c50e5eca541-host\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/440a3dee-fa32-4dd9-8f44-d5532ff12996-hosts-file\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-cni-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911937 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-cni-bin\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911571 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-conf-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-run-ovn-kubernetes\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-ovnkube-config\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ww4kd\" 
(UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-systemd\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/440a3dee-fa32-4dd9-8f44-d5532ff12996-tmp-dir\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-netns\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:13.911937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.911733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-kubelet\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:13.946207 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.946168 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:12 +0000 UTC" 
deadline="2028-01-26 21:34:56.69839845 +0000 UTC" Apr 17 11:16:13.946207 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.946207 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15586h18m42.752195588s" Apr 17 11:16:13.998272 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:13.998239 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:14.012342 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.012342 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-systemd\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-log-socket\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a040950-ccaf-4d81-8e53-7c50e5eca541-serviceca\") pod \"node-ca-btbxk\" (UID: 
\"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysconfig\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-log-socket\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysctl-d\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-systemd\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysctl-conf\") pod \"tuned-r62zn\" (UID: 
\"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/432289f7-2cea-4a47-8acc-2a378b04716a-multus-daemon-config\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysconfig\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2fa87114-d537-4e80-b08a-605d0566022a-agent-certs\") pod \"konnectivity-agent-pgpql\" (UID: \"2fa87114-d537-4e80-b08a-605d0566022a\") " pod="kube-system/konnectivity-agent-pgpql" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysctl-d\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-sysctl-conf\") pod \"tuned-r62zn\" (UID: 
\"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-modprobe-d\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-multus-certs\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-modprobe-d\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-ovn\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-multus-certs\") pod \"multus-wzjb7\" (UID: 
\"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cnibin\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-ovn\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-sys\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cnibin\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-system-cni-dir\") pod \"multus-wzjb7\" (UID: 
\"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/554f8dad-a601-4855-910a-f1e99d5cf979-ovn-node-metrics-cert\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-system-cni-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012865 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2fa87114-d537-4e80-b08a-605d0566022a-konnectivity-ca\") pod \"konnectivity-agent-pgpql\" (UID: \"2fa87114-d537-4e80-b08a-605d0566022a\") " pod="kube-system/konnectivity-agent-pgpql" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-os-release\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-sys\") pod \"tuned-r62zn\" (UID: 
\"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.012998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a040950-ccaf-4d81-8e53-7c50e5eca541-serviceca\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.012987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxtgt\" (UniqueName: \"kubernetes.io/projected/90e677e5-76af-4e0b-b41e-8b6dc29ce009-kube-api-access-qxtgt\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/432289f7-2cea-4a47-8acc-2a378b04716a-multus-daemon-config\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-kubelet\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-etc-kubernetes\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013160 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013176 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-var-lib-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-os-release\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-var-lib-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-etc-kubernetes\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-ovnkube-script-lib\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013355 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-run-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-kubelet\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-device-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a040950-ccaf-4d81-8e53-7c50e5eca541-host\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/440a3dee-fa32-4dd9-8f44-d5532ff12996-hosts-file\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:14.013706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-cni-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a040950-ccaf-4d81-8e53-7c50e5eca541-host\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-cni-bin\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-cni-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/440a3dee-fa32-4dd9-8f44-d5532ff12996-hosts-file\") pod 
\"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-conf-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-run-ovn-kubernetes\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-conf-dir\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-cni-bin\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-ovnkube-config\") pod \"ovnkube-node-hzwl5\" (UID: 
\"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-run-ovn-kubernetes\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-registration-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-systemd\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-systemd\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/440a3dee-fa32-4dd9-8f44-d5532ff12996-tmp-dir\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-ovnkube-script-lib\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-netns\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.014595 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-kubelet\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-sys-fs\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-kubelet\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.013945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfb8\" (UniqueName: \"kubernetes.io/projected/43cab76d-7c1c-49f8-8a36-79896bc24bdc-kube-api-access-4rfb8\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 
11:16:14.014000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9xk\" (UniqueName: \"kubernetes.io/projected/0a040950-ccaf-4d81-8e53-7c50e5eca541-kube-api-access-7f9xk\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76mh4\" (UniqueName: \"kubernetes.io/projected/432289f7-2cea-4a47-8acc-2a378b04716a-kube-api-access-76mh4\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-run-netns\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-node-log\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/440a3dee-fa32-4dd9-8f44-d5532ff12996-tmp-dir\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014074 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-cni-netd\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvjk\" (UniqueName: \"kubernetes.io/projected/554f8dad-a601-4855-910a-f1e99d5cf979-kube-api-access-nfvjk\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-var-lib-kubelet\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6421a751-16fb-48d9-b12a-268fc1c823b1-iptables-alerter-script\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-socket-dir-parent\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.015496 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbwvt\" (UniqueName: \"kubernetes.io/projected/b1f63578-d509-4932-85a5-009a7eca72d5-kube-api-access-tbwvt\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.015496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-ovnkube-config\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9lgh\" (UniqueName: \"kubernetes.io/projected/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-kube-api-access-v9lgh\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-netns\") pod 
\"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-run\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/432289f7-2cea-4a47-8acc-2a378b04716a-cni-binary-copy\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-cni-bin\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-tuned\") pod \"tuned-r62zn\" (UID: 
\"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-run-netns\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-cni-netd\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-cnibin\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-hostroot\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-cnibin\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 
11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-etc-selinux\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-multus-socket-dir-parent\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-var-lib-kubelet\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.016256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90e677e5-76af-4e0b-b41e-8b6dc29ce009-tmp\") pod \"tuned-r62zn\" (UID: 
\"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-cni-bin\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.014638 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.014737 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.514687419 +0000 UTC m=+3.052609654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-run\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-node-log\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-hostroot\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.014935 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6n75d\" (UniqueName: \"kubernetes.io/projected/440a3dee-fa32-4dd9-8f44-d5532ff12996-kube-api-access-6n75d\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-os-release\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-k8s-cni-cncf-io\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-cni-multus\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015181 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-systemd-units\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-socket-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-system-cni-dir\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd" Apr 17 11:16:14.016994 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-var-lib-cni-multus\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " 
pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-host-run-k8s-cni-cncf-io\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-systemd-units\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cni-binary-copy\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43cab76d-7c1c-49f8-8a36-79896bc24bdc-system-cni-dir\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-lib-modules\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-host\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6421a751-16fb-48d9-b12a-268fc1c823b1-host-slash\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrrq9\" (UniqueName: \"kubernetes.io/projected/6421a751-16fb-48d9-b12a-268fc1c823b1-kube-api-access-wrrq9\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-slash\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-etc-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-kubernetes\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-lib-modules\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-env-overrides\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43cab76d-7c1c-49f8-8a36-79896bc24bdc-cni-binary-copy\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-host-slash\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/432289f7-2cea-4a47-8acc-2a378b04716a-os-release\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:14.017738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6421a751-16fb-48d9-b12a-268fc1c823b1-host-slash\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6421a751-16fb-48d9-b12a-268fc1c823b1-iptables-alerter-script\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-host\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/554f8dad-a601-4855-910a-f1e99d5cf979-etc-openvswitch\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.015961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-kubernetes\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.016052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/554f8dad-a601-4855-910a-f1e99d5cf979-env-overrides\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.016221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/432289f7-2cea-4a47-8acc-2a378b04716a-cni-binary-copy\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.017695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/554f8dad-a601-4855-910a-f1e99d5cf979-ovn-node-metrics-cert\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.017708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90e677e5-76af-4e0b-b41e-8b6dc29ce009-tmp\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.018305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.018046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/90e677e5-76af-4e0b-b41e-8b6dc29ce009-etc-tuned\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.019731 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.019712 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:14.019731 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.019734 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:14.019879 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.019743 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:14.019879 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.019792 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.51977644 +0000 UTC m=+3.057698666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:14.022532 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.022479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxtgt\" (UniqueName: \"kubernetes.io/projected/90e677e5-76af-4e0b-b41e-8b6dc29ce009-kube-api-access-qxtgt\") pod \"tuned-r62zn\" (UID: \"90e677e5-76af-4e0b-b41e-8b6dc29ce009\") " pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.023124 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.023057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76mh4\" (UniqueName: \"kubernetes.io/projected/432289f7-2cea-4a47-8acc-2a378b04716a-kube-api-access-76mh4\") pod \"multus-wzjb7\" (UID: \"432289f7-2cea-4a47-8acc-2a378b04716a\") " pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:14.023124 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.023081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfb8\" (UniqueName: \"kubernetes.io/projected/43cab76d-7c1c-49f8-8a36-79896bc24bdc-kube-api-access-4rfb8\") pod \"multus-additional-cni-plugins-ww4kd\" (UID: \"43cab76d-7c1c-49f8-8a36-79896bc24bdc\") " pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:14.023913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.023895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n75d\" (UniqueName: \"kubernetes.io/projected/440a3dee-fa32-4dd9-8f44-d5532ff12996-kube-api-access-6n75d\") pod \"node-resolver-5rdwf\" (UID: \"440a3dee-fa32-4dd9-8f44-d5532ff12996\") " pod="openshift-dns/node-resolver-5rdwf"
Apr 17 11:16:14.024605 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.024567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvjk\" (UniqueName: \"kubernetes.io/projected/554f8dad-a601-4855-910a-f1e99d5cf979-kube-api-access-nfvjk\") pod \"ovnkube-node-hzwl5\" (UID: \"554f8dad-a601-4855-910a-f1e99d5cf979\") " pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.025236 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.025214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9lgh\" (UniqueName: \"kubernetes.io/projected/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-kube-api-access-v9lgh\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:14.025855 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.025814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrrq9\" (UniqueName: \"kubernetes.io/projected/6421a751-16fb-48d9-b12a-268fc1c823b1-kube-api-access-wrrq9\") pod \"iptables-alerter-qbctw\" (UID: \"6421a751-16fb-48d9-b12a-268fc1c823b1\") " pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:14.026064 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.026044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9xk\" (UniqueName: \"kubernetes.io/projected/0a040950-ccaf-4d81-8e53-7c50e5eca541-kube-api-access-7f9xk\") pod \"node-ca-btbxk\" (UID: \"0a040950-ccaf-4d81-8e53-7c50e5eca541\") " pod="openshift-image-registry/node-ca-btbxk"
Apr 17 11:16:14.116838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-device-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.116838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-registration-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-sys-fs\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbwvt\" (UniqueName: \"kubernetes.io/projected/b1f63578-d509-4932-85a5-009a7eca72d5-kube-api-access-tbwvt\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-device-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-etc-selinux\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-socket-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.116978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-registration-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2fa87114-d537-4e80-b08a-605d0566022a-agent-certs\") pod \"konnectivity-agent-pgpql\" (UID: \"2fa87114-d537-4e80-b08a-605d0566022a\") " pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-etc-selinux\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-sys-fs\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2fa87114-d537-4e80-b08a-605d0566022a-konnectivity-ca\") pod \"konnectivity-agent-pgpql\" (UID: \"2fa87114-d537-4e80-b08a-605d0566022a\") " pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:14.117137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.117665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.117198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b1f63578-d509-4932-85a5-009a7eca72d5-socket-dir\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.118069 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.118050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2fa87114-d537-4e80-b08a-605d0566022a-konnectivity-ca\") pod \"konnectivity-agent-pgpql\" (UID: \"2fa87114-d537-4e80-b08a-605d0566022a\") " pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:14.119909 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.119884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2fa87114-d537-4e80-b08a-605d0566022a-agent-certs\") pod \"konnectivity-agent-pgpql\" (UID: \"2fa87114-d537-4e80-b08a-605d0566022a\") " pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:14.126169 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.126106 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbwvt\" (UniqueName: \"kubernetes.io/projected/b1f63578-d509-4932-85a5-009a7eca72d5-kube-api-access-tbwvt\") pod \"aws-ebs-csi-driver-node-gstdw\" (UID: \"b1f63578-d509-4932-85a5-009a7eca72d5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.151907 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.151873 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:14.193766 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.193726 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ww4kd"
Apr 17 11:16:14.202632 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.202610 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qbctw"
Apr 17 11:16:14.211374 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.211336 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wzjb7"
Apr 17 11:16:14.218016 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.217997 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5rdwf"
Apr 17 11:16:14.224652 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.224636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-btbxk"
Apr 17 11:16:14.230404 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.230384 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r62zn"
Apr 17 11:16:14.239379 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.239341 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:14.244983 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.244963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:14.249625 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.249604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw"
Apr 17 11:16:14.283393 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.283200 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6421a751_16fb_48d9_b12a_268fc1c823b1.slice/crio-9380d71d6e400a3e695a0fd6fc7162d5f8632d2b9aa46ae3409df8d1730e989c WatchSource:0}: Error finding container 9380d71d6e400a3e695a0fd6fc7162d5f8632d2b9aa46ae3409df8d1730e989c: Status 404 returned error can't find the container with id 9380d71d6e400a3e695a0fd6fc7162d5f8632d2b9aa46ae3409df8d1730e989c
Apr 17 11:16:14.286430 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.286342 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43cab76d_7c1c_49f8_8a36_79896bc24bdc.slice/crio-e1e8c0bb2006b0f87ae3e1e060e95689bf56d6d83bc9cc842fc7909732cb7e46 WatchSource:0}: Error finding container e1e8c0bb2006b0f87ae3e1e060e95689bf56d6d83bc9cc842fc7909732cb7e46: Status 404 returned error can't find the container with id e1e8c0bb2006b0f87ae3e1e060e95689bf56d6d83bc9cc842fc7909732cb7e46
Apr 17 11:16:14.289663 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.289635 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod554f8dad_a601_4855_910a_f1e99d5cf979.slice/crio-7fa5f14281e7d79fc9d174f40e023177d826e1a66fd9e657dae89880be9e162d WatchSource:0}: Error finding container 7fa5f14281e7d79fc9d174f40e023177d826e1a66fd9e657dae89880be9e162d: Status 404 returned error can't find the container with id 7fa5f14281e7d79fc9d174f40e023177d826e1a66fd9e657dae89880be9e162d
Apr 17 11:16:14.290294 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.290251 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432289f7_2cea_4a47_8acc_2a378b04716a.slice/crio-a5b343088cc5a4c40cfe84b5c2ebc95a8e0b732055c2ad791514e9d063117343 WatchSource:0}: Error finding container a5b343088cc5a4c40cfe84b5c2ebc95a8e0b732055c2ad791514e9d063117343: Status 404 returned error can't find the container with id a5b343088cc5a4c40cfe84b5c2ebc95a8e0b732055c2ad791514e9d063117343
Apr 17 11:16:14.291154 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.291062 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1f63578_d509_4932_85a5_009a7eca72d5.slice/crio-15044030f6bbaa0e914948ad60a5988e81e1c9a6bde6cbba6c545e4d0da54a50 WatchSource:0}: Error finding container 15044030f6bbaa0e914948ad60a5988e81e1c9a6bde6cbba6c545e4d0da54a50: Status 404 returned error can't find the container with id 15044030f6bbaa0e914948ad60a5988e81e1c9a6bde6cbba6c545e4d0da54a50
Apr 17 11:16:14.292445 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.292417 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a040950_ccaf_4d81_8e53_7c50e5eca541.slice/crio-a1b4d2e5595a8d30f079fe6d6d4ba2ee4daf2fe00ff4008620d3a4627ee7e2e9 WatchSource:0}: Error finding container a1b4d2e5595a8d30f079fe6d6d4ba2ee4daf2fe00ff4008620d3a4627ee7e2e9: Status 404 returned error can't find the container with id a1b4d2e5595a8d30f079fe6d6d4ba2ee4daf2fe00ff4008620d3a4627ee7e2e9
Apr 17 11:16:14.314700 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.314672 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa87114_d537_4e80_b08a_605d0566022a.slice/crio-a43c3a82ea0671e316830217f0bcce9d75f355e33318a8ce183c270051164a6f WatchSource:0}: Error finding container a43c3a82ea0671e316830217f0bcce9d75f355e33318a8ce183c270051164a6f: Status 404 returned error can't find the container with id a43c3a82ea0671e316830217f0bcce9d75f355e33318a8ce183c270051164a6f
Apr 17 11:16:14.315445 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.315416 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440a3dee_fa32_4dd9_8f44_d5532ff12996.slice/crio-ffebae4cc173c41152f853f8739492a14b38859e688ac219ed0e0c898b02f8fd WatchSource:0}: Error finding container ffebae4cc173c41152f853f8739492a14b38859e688ac219ed0e0c898b02f8fd: Status 404 returned error can't find the container with id ffebae4cc173c41152f853f8739492a14b38859e688ac219ed0e0c898b02f8fd
Apr 17 11:16:14.316716 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:14.316687 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e677e5_76af_4e0b_b41e_8b6dc29ce009.slice/crio-117919c189425e5b1e9de089d5ab2f6917dd293a0af2ad5d03f072606a7cf9a1 WatchSource:0}: Error finding container 117919c189425e5b1e9de089d5ab2f6917dd293a0af2ad5d03f072606a7cf9a1: Status 404 returned error can't find the container with id 117919c189425e5b1e9de089d5ab2f6917dd293a0af2ad5d03f072606a7cf9a1
Apr 17 11:16:14.520582 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.520286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:14.520784 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.520648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:14.520784 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.520477 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:14.520784 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.520703 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:14.520784 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.520719 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:14.520784 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.520776 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:14.520978 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.520780 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:15.520760842 +0000 UTC m=+4.058683062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:14.520978 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:14.520838 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:15.520821534 +0000 UTC m=+4.058743755 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:14.946673 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.946628 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:12 +0000 UTC" deadline="2028-01-20 07:08:12.316586018 +0000 UTC"
Apr 17 11:16:14.946673 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:14.946671 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15427h51m57.369919183s"
Apr 17 11:16:15.062170 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.061439 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:15.062170 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.061567 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:15.062170 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.062008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:15.062170 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.062112 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:15.076413 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.075842 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" event={"ID":"8a71fa9642bbb7713db79711084fe6ff","Type":"ContainerStarted","Data":"49a60abc10b7e41d21d8992b7bacd5532c96d02230047f0e86bcd43d37e9b749"}
Apr 17 11:16:15.084316 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.084263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wzjb7" event={"ID":"432289f7-2cea-4a47-8acc-2a378b04716a","Type":"ContainerStarted","Data":"a5b343088cc5a4c40cfe84b5c2ebc95a8e0b732055c2ad791514e9d063117343"}
Apr 17 11:16:15.091556 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.091496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pgpql" event={"ID":"2fa87114-d537-4e80-b08a-605d0566022a","Type":"ContainerStarted","Data":"a43c3a82ea0671e316830217f0bcce9d75f355e33318a8ce183c270051164a6f"}
Apr 17 11:16:15.094649 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.094562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5rdwf" event={"ID":"440a3dee-fa32-4dd9-8f44-d5532ff12996","Type":"ContainerStarted","Data":"ffebae4cc173c41152f853f8739492a14b38859e688ac219ed0e0c898b02f8fd"}
Apr 17 11:16:15.102007 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.101976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-btbxk" event={"ID":"0a040950-ccaf-4d81-8e53-7c50e5eca541","Type":"ContainerStarted","Data":"a1b4d2e5595a8d30f079fe6d6d4ba2ee4daf2fe00ff4008620d3a4627ee7e2e9"}
Apr 17 11:16:15.112603 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.112572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" event={"ID":"b1f63578-d509-4932-85a5-009a7eca72d5","Type":"ContainerStarted","Data":"15044030f6bbaa0e914948ad60a5988e81e1c9a6bde6cbba6c545e4d0da54a50"}
Apr 17 11:16:15.117685 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.117553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"7fa5f14281e7d79fc9d174f40e023177d826e1a66fd9e657dae89880be9e162d"}
Apr 17 11:16:15.119714 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.119676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r62zn" event={"ID":"90e677e5-76af-4e0b-b41e-8b6dc29ce009","Type":"ContainerStarted","Data":"117919c189425e5b1e9de089d5ab2f6917dd293a0af2ad5d03f072606a7cf9a1"}
Apr 17 11:16:15.127039 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.127009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerStarted","Data":"e1e8c0bb2006b0f87ae3e1e060e95689bf56d6d83bc9cc842fc7909732cb7e46"}
Apr 17 11:16:15.132790 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.132762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qbctw" event={"ID":"6421a751-16fb-48d9-b12a-268fc1c823b1","Type":"ContainerStarted","Data":"9380d71d6e400a3e695a0fd6fc7162d5f8632d2b9aa46ae3409df8d1730e989c"}
Apr 17 11:16:15.529493 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:15.529442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:15.529683 ip-10-0-128-205 kubenswrapper[2577]:
I0417 11:16:15.529527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:15.529743 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.529682 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:15.529743 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.529702 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:15.529743 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.529714 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:15.529889 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.529771 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:17.52975226 +0000 UTC m=+6.067674476 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:15.530226 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.530166 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:15.530226 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:15.530219 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:17.530204567 +0000 UTC m=+6.068126779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:17.061793 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:17.061722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:17.062252 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.061855 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:17.062312 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:17.062284 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:17.062421 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.062399 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:17.546168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:17.546118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:17.546343 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:17.546186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:17.546343 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.546317 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:17.546479 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.546400 2577 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:21.546380469 +0000 UTC m=+10.084302694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:17.546868 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.546850 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:17.546938 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.546873 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:17.546938 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.546885 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:17.546938 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:17.546927 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:21.546913258 +0000 UTC m=+10.084835470 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:19.061616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:19.061580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:19.062109 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:19.061585 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:19.062109 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:19.061705 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:19.062109 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:19.061824 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:21.062097 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:21.062060 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:21.062097 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:21.062075 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:21.062586 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.062205 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:21.062586 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.062333 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:21.579471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:21.579536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.579746 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.579810 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:29.579791041 +0000 UTC m=+18.117713267 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.580264 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.580283 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.580296 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:21.580408 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:21.580339 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:29.580325145 +0000 UTC m=+18.118247370 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:23.061607 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:23.061569 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:23.062048 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:23.061569 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:23.062048 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:23.061708 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:23.062048 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:23.061805 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:25.061415 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:25.061384 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:25.061839 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:25.061390 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:25.061839 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:25.061497 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:25.061839 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:25.061579 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:27.062107 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:27.062076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:27.062107 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:27.062093 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:27.062561 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:27.062194 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:27.062561 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:27.062334 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:29.062103 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:29.062065 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:29.062556 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:29.062070 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:29.062556 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.062200 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:29.062556 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.062345 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:29.638161 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:29.638126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:29.638336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:29.638181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:29.638336 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.638292 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:29.638336 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.638303 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:29.638336 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.638322 2577 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:29.638506 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.638357 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:29.638506 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.638387 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.638345863 +0000 UTC m=+34.176268084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:29.638506 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:29.638417 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.638405585 +0000 UTC m=+34.176327795 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:31.061737 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:31.061700 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:31.061737 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:31.061722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:31.062304 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:31.061862 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:16:31.062304 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:31.062057 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:32.173102 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.172916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wzjb7" event={"ID":"432289f7-2cea-4a47-8acc-2a378b04716a","Type":"ContainerStarted","Data":"132c568f193be2621813e07ccf7a5d1202c2fef807b746a164f2fa888868dc8b"} Apr 17 11:16:32.174698 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.174676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"976ead3c4c99c62961d84fb38561a6a666e2ba45044f29f9096c7be68211e873"} Apr 17 11:16:32.174776 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.174706 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"7c224f1131c48403b1489b6b1a570e4e31ca88686e1ee1a2f99ead1a5cb2f257"} Apr 17 11:16:32.174776 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.174731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"d8e1a1cd990e9e1ce8b3d918538cbede049af0ed883aeadd87c626a20c248f71"} Apr 17 11:16:32.176153 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.176131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r62zn" event={"ID":"90e677e5-76af-4e0b-b41e-8b6dc29ce009","Type":"ContainerStarted","Data":"04e3600a7fecfb407caf9b810208ba4d5976cee820bbc02a145c00ac28c0a580"} Apr 17 11:16:32.196281 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.194884 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-205.ec2.internal" podStartSLOduration=19.19486851 podStartE2EDuration="19.19486851s" podCreationTimestamp="2026-04-17 11:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:15.089770831 +0000 UTC m=+3.627693065" watchObservedRunningTime="2026-04-17 11:16:32.19486851 +0000 UTC m=+20.732790755" Apr 17 11:16:32.196281 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.195680 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wzjb7" podStartSLOduration=2.81133773 podStartE2EDuration="20.195667686s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.312999557 +0000 UTC m=+2.850921770" lastFinishedPulling="2026-04-17 11:16:31.697329516 +0000 UTC m=+20.235251726" observedRunningTime="2026-04-17 11:16:32.193580398 +0000 UTC m=+20.731502624" watchObservedRunningTime="2026-04-17 11:16:32.195667686 +0000 UTC m=+20.733589918" Apr 17 11:16:32.208778 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:32.208741 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r62zn" podStartSLOduration=2.89106233 podStartE2EDuration="20.208729227s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.318884438 +0000 UTC m=+2.856806648" lastFinishedPulling="2026-04-17 11:16:31.636551332 +0000 UTC m=+20.174473545" observedRunningTime="2026-04-17 11:16:32.208694445 +0000 UTC m=+20.746616677" watchObservedRunningTime="2026-04-17 11:16:32.208729227 +0000 UTC m=+20.746651458" Apr 17 11:16:33.061702 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.061671 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:33.061877 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.061671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:33.061877 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:33.061774 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96" Apr 17 11:16:33.061964 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:33.061873 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:33.180579 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.180547 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log"
Apr 17 11:16:33.181199 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.180843 2577 generic.go:358] "Generic (PLEG): container finished" podID="554f8dad-a601-4855-910a-f1e99d5cf979" containerID="7c224f1131c48403b1489b6b1a570e4e31ca88686e1ee1a2f99ead1a5cb2f257" exitCode=1
Apr 17 11:16:33.181199 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.180874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerDied","Data":"7c224f1131c48403b1489b6b1a570e4e31ca88686e1ee1a2f99ead1a5cb2f257"}
Apr 17 11:16:33.181199 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.180904 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"f9d1047d7608e264e4e2aa54cf64df5efa77f5a4bc75d3b219414f5507b9e073"}
Apr 17 11:16:33.181199 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.180919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"6740daaf4e1fd64d1995b2bc30770a963c67eb55dfbb5867ad3e976511a9e1cb"}
Apr 17 11:16:33.181199 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.180927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"6608278884d724d593020a09b057268efc07eb175a646b0c63fb3f4d7077c8c3"}
Apr 17 11:16:33.182164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.182135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qbctw" event={"ID":"6421a751-16fb-48d9-b12a-268fc1c823b1","Type":"ContainerStarted","Data":"667284f794bcb1e720775a3ea60d612ee9d34bf8aec44b9fb85063da36cbaa15"}
Apr 17 11:16:33.195499 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:33.195445 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qbctw" podStartSLOduration=3.825379781 podStartE2EDuration="21.195431392s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.285302725 +0000 UTC m=+2.823224935" lastFinishedPulling="2026-04-17 11:16:31.655354336 +0000 UTC m=+20.193276546" observedRunningTime="2026-04-17 11:16:33.19505075 +0000 UTC m=+21.732972982" watchObservedRunningTime="2026-04-17 11:16:33.195431392 +0000 UTC m=+21.733353654"
Apr 17 11:16:35.061636 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:35.061600 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:35.062003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:35.061602 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:35.062003 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:35.061703 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:35.062003 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:35.061793 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:35.188223 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:35.188198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log"
Apr 17 11:16:35.188547 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:35.188525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"cd0db1558e443c56d898432f247a6e1d95e9dfde46416489b7b49db6d02c8238"}
Apr 17 11:16:36.191908 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.191867 2577 generic.go:358] "Generic (PLEG): container finished" podID="3760fde5efd48d0c40ad74e563ff23b8" containerID="e7d89167638539d185ff695ed7a4c7fb06a828abbf8c48168e660a4cbd713e94" exitCode=0
Apr 17 11:16:36.192353 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.191949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" event={"ID":"3760fde5efd48d0c40ad74e563ff23b8","Type":"ContainerDied","Data":"e7d89167638539d185ff695ed7a4c7fb06a828abbf8c48168e660a4cbd713e94"}
Apr 17 11:16:36.193206 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.193183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pgpql" event={"ID":"2fa87114-d537-4e80-b08a-605d0566022a","Type":"ContainerStarted","Data":"6f309b0423671c4073a91b544eab12b82bb19ec8e0f816556a8619375772a1d1"}
Apr 17 11:16:36.194549 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.194495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5rdwf" event={"ID":"440a3dee-fa32-4dd9-8f44-d5532ff12996","Type":"ContainerStarted","Data":"855da32951672a4803eb3a65d2a8a6913146cc94fe59ad48cc9557621fcfcdb7"}
Apr 17 11:16:36.195815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.195795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-btbxk" event={"ID":"0a040950-ccaf-4d81-8e53-7c50e5eca541","Type":"ContainerStarted","Data":"d7d79cbace1e208efdb1e8e606bacd4d0ef1c9faa01295c5c25cee57cabd80c0"}
Apr 17 11:16:36.197263 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.197244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" event={"ID":"b1f63578-d509-4932-85a5-009a7eca72d5","Type":"ContainerStarted","Data":"79eb8ab82ba9703047ebb85a5cfaa99e81900a3c4149b6cb4c18f8a41a6c598f"}
Apr 17 11:16:36.198572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.198553 2577 generic.go:358] "Generic (PLEG): container finished" podID="43cab76d-7c1c-49f8-8a36-79896bc24bdc" containerID="1ccd7dba1306765f86b70d077a42791ba700bf7bfc2686d8ac25e6048bcddc2a" exitCode=0
Apr 17 11:16:36.198662 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.198586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerDied","Data":"1ccd7dba1306765f86b70d077a42791ba700bf7bfc2686d8ac25e6048bcddc2a"}
Apr 17 11:16:36.226325 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.226282 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pgpql" podStartSLOduration=11.52356711 podStartE2EDuration="24.226266751s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.319322613 +0000 UTC m=+2.857244824" lastFinishedPulling="2026-04-17 11:16:27.022022255 +0000 UTC m=+15.559944465" observedRunningTime="2026-04-17 11:16:36.225732027 +0000 UTC m=+24.763654269" watchObservedRunningTime="2026-04-17 11:16:36.226266751 +0000 UTC m=+24.764188985"
Apr 17 11:16:36.260341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.260289 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5rdwf" podStartSLOduration=6.924134799 podStartE2EDuration="24.260271752s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.319286754 +0000 UTC m=+2.857208967" lastFinishedPulling="2026-04-17 11:16:31.655423696 +0000 UTC m=+20.193345920" observedRunningTime="2026-04-17 11:16:36.260214556 +0000 UTC m=+24.798136787" watchObservedRunningTime="2026-04-17 11:16:36.260271752 +0000 UTC m=+24.798193984"
Apr 17 11:16:36.274803 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.274746 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-btbxk" podStartSLOduration=6.932352577 podStartE2EDuration="24.274728753s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.312976056 +0000 UTC m=+2.850898283" lastFinishedPulling="2026-04-17 11:16:31.655352248 +0000 UTC m=+20.193274459" observedRunningTime="2026-04-17 11:16:36.274103192 +0000 UTC m=+24.812025427" watchObservedRunningTime="2026-04-17 11:16:36.274728753 +0000 UTC m=+24.812650987"
Apr 17 11:16:36.719470 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.719264 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 11:16:36.980884 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.980772 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:36.719464592Z","UUID":"8b85c42a-4e9f-4a01-916b-679016d190f5","Handler":null,"Name":"","Endpoint":""}
Apr 17 11:16:36.982623 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.982588 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 11:16:36.982623 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:36.982618 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 11:16:37.061937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.061845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:37.062090 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.061845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:37.062090 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:37.061982 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:37.062195 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:37.062093 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:37.203124 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.203070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" event={"ID":"3760fde5efd48d0c40ad74e563ff23b8","Type":"ContainerStarted","Data":"ee3ef78e5b126c6a5b741a5b1f55ae0f0ae2a78b362784cc54b9efaf727901de"}
Apr 17 11:16:37.205635 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.205603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" event={"ID":"b1f63578-d509-4932-85a5-009a7eca72d5","Type":"ContainerStarted","Data":"600d42fce442bd0d81191dbdd96f9a06bd55bca5744e31fe936336143f15104f"}
Apr 17 11:16:37.208965 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.208946 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log"
Apr 17 11:16:37.209393 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.209344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"8996972b856c2c99fbb90e536050d88445b06a6698ced5bdff6c7f87dd7ef850"}
Apr 17 11:16:37.209769 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.209750 2577 scope.go:117] "RemoveContainer" containerID="7c224f1131c48403b1489b6b1a570e4e31ca88686e1ee1a2f99ead1a5cb2f257"
Apr 17 11:16:37.220852 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:37.220817 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-205.ec2.internal" podStartSLOduration=24.220806567 podStartE2EDuration="24.220806567s" podCreationTimestamp="2026-04-17 11:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:37.220084098 +0000 UTC m=+25.758006330" watchObservedRunningTime="2026-04-17 11:16:37.220806567 +0000 UTC m=+25.758728799"
Apr 17 11:16:38.213522 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.213259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" event={"ID":"b1f63578-d509-4932-85a5-009a7eca72d5","Type":"ContainerStarted","Data":"2e0c8fdab5f9282c61047a793ee9d032e1cd1dec31d839ea64ed614b4ecf4f8a"}
Apr 17 11:16:38.216870 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.216848 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log"
Apr 17 11:16:38.217247 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.217167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" event={"ID":"554f8dad-a601-4855-910a-f1e99d5cf979","Type":"ContainerStarted","Data":"6d03405e7d46188a6e44c2fe89ca7967db1a362701acafe7ffc7a86f483700ab"}
Apr 17 11:16:38.218003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.217684 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:38.218003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.217711 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:38.218003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.217724 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:38.234757 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.234716 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gstdw" podStartSLOduration=3.071303292 podStartE2EDuration="26.234698821s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.313061787 +0000 UTC m=+2.850984018" lastFinishedPulling="2026-04-17 11:16:37.476457317 +0000 UTC m=+26.014379547" observedRunningTime="2026-04-17 11:16:38.234358374 +0000 UTC m=+26.772280619" watchObservedRunningTime="2026-04-17 11:16:38.234698821 +0000 UTC m=+26.772621054"
Apr 17 11:16:38.235283 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.235264 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:38.235570 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.235554 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5"
Apr 17 11:16:38.298706 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:38.298652 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" podStartSLOduration=8.854527411 podStartE2EDuration="26.298637131s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.291422076 +0000 UTC m=+2.829344292" lastFinishedPulling="2026-04-17 11:16:31.735531803 +0000 UTC m=+20.273454012" observedRunningTime="2026-04-17 11:16:38.289156371 +0000 UTC m=+26.827078594" watchObservedRunningTime="2026-04-17 11:16:38.298637131 +0000 UTC m=+26.836559366"
Apr 17 11:16:39.061967 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.061934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:39.062154 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:39.062058 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:39.062154 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.062114 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:39.062272 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:39.062238 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:39.084018 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.083977 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pb4l4"]
Apr 17 11:16:39.086612 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.086583 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dn4mx"]
Apr 17 11:16:39.218996 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.218962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:39.218996 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.218962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:39.219512 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:39.219227 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:39.219877 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:39.219853 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:39.786385 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.786337 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:39.787042 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:39.787023 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:40.220525 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:40.220501 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:40.221039 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:40.221020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pgpql"
Apr 17 11:16:41.061576 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:41.061538 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:41.061576 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:41.061573 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:41.061745 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:41.061649 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:41.061801 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:41.061775 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:41.223646 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:41.223612 2577 generic.go:358] "Generic (PLEG): container finished" podID="43cab76d-7c1c-49f8-8a36-79896bc24bdc" containerID="2239521d240b98d45ebcd388988c2360e176e7f36e0fb866ffd4b42840411b69" exitCode=0
Apr 17 11:16:41.224003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:41.223697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerDied","Data":"2239521d240b98d45ebcd388988c2360e176e7f36e0fb866ffd4b42840411b69"}
Apr 17 11:16:42.227089 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:42.226885 2577 generic.go:358] "Generic (PLEG): container finished" podID="43cab76d-7c1c-49f8-8a36-79896bc24bdc" containerID="22dc0b07313ee083c84bc8d675ad9184ad8eeea61cc78871e565567efa77a770" exitCode=0
Apr 17 11:16:42.227552 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:42.226974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerDied","Data":"22dc0b07313ee083c84bc8d675ad9184ad8eeea61cc78871e565567efa77a770"}
Apr 17 11:16:43.062009 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:43.061970 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:43.062187 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:43.061978 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:43.062187 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:43.062100 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:43.062187 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:43.062154 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:43.231088 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:43.231055 2577 generic.go:358] "Generic (PLEG): container finished" podID="43cab76d-7c1c-49f8-8a36-79896bc24bdc" containerID="c8a2b1ded8847877d5366f978fe3b08873cb397373ecb767ff08594c13437655" exitCode=0
Apr 17 11:16:43.231588 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:43.231114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerDied","Data":"c8a2b1ded8847877d5366f978fe3b08873cb397373ecb767ff08594c13437655"}
Apr 17 11:16:45.061500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.061419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:16:45.062018 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.061554 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0"
Apr 17 11:16:45.062018 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.061614 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4"
Apr 17 11:16:45.062018 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.061727 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pb4l4" podUID="5848b99b-c76c-47de-b92e-288c830c8a96"
Apr 17 11:16:45.266937 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.266900 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-205.ec2.internal" event="NodeReady"
Apr 17 11:16:45.267214 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.267061 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 11:16:45.322390 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.322295 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-crscn"]
Apr 17 11:16:45.324850 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.324821 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kxgxj"]
Apr 17 11:16:45.324983 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.324913 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.326486 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.326465 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kxgxj"
Apr 17 11:16:45.327561 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.327542 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bd2vh\""
Apr 17 11:16:45.327683 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.327573 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 11:16:45.327838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.327824 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 11:16:45.328257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.328238 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j8tns\""
Apr 17 11:16:45.328257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.328248 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 11:16:45.328792 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.328561 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 11:16:45.328792 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.328697 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 11:16:45.339525 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.339506 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-crscn"]
Apr 17 11:16:45.340047 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.340027 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kxgxj"]
Apr 17 11:16:45.465651 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.465614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.465822 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.465664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142cbae1-73ac-4077-9d7f-b3393da4de44-config-volume\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.465822 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.465738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlk65\" (UniqueName: \"kubernetes.io/projected/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-kube-api-access-vlk65\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj"
Apr 17 11:16:45.465822 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.465788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/142cbae1-73ac-4077-9d7f-b3393da4de44-tmp-dir\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.465951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.465852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj"
Apr 17 11:16:45.465951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.465886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsknx\" (UniqueName: \"kubernetes.io/projected/142cbae1-73ac-4077-9d7f-b3393da4de44-kube-api-access-rsknx\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.567002 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.566962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.567193 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142cbae1-73ac-4077-9d7f-b3393da4de44-config-volume\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.567193 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlk65\" (UniqueName: \"kubernetes.io/projected/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-kube-api-access-vlk65\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj"
Apr 17 11:16:45.567193 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/142cbae1-73ac-4077-9d7f-b3393da4de44-tmp-dir\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.567193 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.567085 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:16:45.567193 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj"
Apr 17 11:16:45.567193 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.567162 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.06714039 +0000 UTC m=+34.605062605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found
Apr 17 11:16:45.567523 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.567223 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:16:45.567523 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.567273 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:46.067255821 +0000 UTC m=+34.605178050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found
Apr 17 11:16:45.567523 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsknx\" (UniqueName: \"kubernetes.io/projected/142cbae1-73ac-4077-9d7f-b3393da4de44-kube-api-access-rsknx\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.567714 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142cbae1-73ac-4077-9d7f-b3393da4de44-config-volume\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.567868 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.567819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/142cbae1-73ac-4077-9d7f-b3393da4de44-tmp-dir\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.580510 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.580441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsknx\" (UniqueName: \"kubernetes.io/projected/142cbae1-73ac-4077-9d7f-b3393da4de44-kube-api-access-rsknx\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn"
Apr 17 11:16:45.580510 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.580499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlk65\" (UniqueName:
\"kubernetes.io/projected/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-kube-api-access-vlk65\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:16:45.667840 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.667805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:45.668010 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:45.667867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:45.668010 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.667982 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:45.668122 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.668063 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:17.668041531 +0000 UTC m=+66.205963759 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:45.668122 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.667988 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:45.668122 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.668102 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:45.668122 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.668115 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rhxsh for pod openshift-network-diagnostics/network-check-target-pb4l4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:45.668320 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:45.668181 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh podName:5848b99b-c76c-47de-b92e-288c830c8a96 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:17.668161484 +0000 UTC m=+66.206083708 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rhxsh" (UniqueName: "kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh") pod "network-check-target-pb4l4" (UID: "5848b99b-c76c-47de-b92e-288c830c8a96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:46.071230 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:46.071191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:16:46.071895 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:46.071285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:16:46.071895 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:46.071316 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:46.071895 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:46.071398 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.071382426 +0000 UTC m=+35.609304641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:16:46.071895 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:46.071403 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:46.071895 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:46.071452 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.071436871 +0000 UTC m=+35.609359081 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:16:47.062400 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.062349 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:16:47.062400 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.062398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:16:47.067990 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.067964 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:47.068507 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.068294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:47.068885 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.068852 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nzl46\"" Apr 17 11:16:47.069009 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.068898 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-g9l5h\"" Apr 17 11:16:47.069338 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.069318 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:47.077572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.077542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:16:47.077929 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:47.077623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:16:47.077929 ip-10-0-128-205 
kubenswrapper[2577]: E0417 11:16:47.077696 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:47.077929 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:47.077730 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:47.077929 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:47.077759 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:49.077738396 +0000 UTC m=+37.615660610 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:16:47.077929 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:47.077778 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:49.077766114 +0000 UTC m=+37.615688325 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:16:49.092204 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:49.092168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:16:49.092578 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:49.092238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:16:49.092578 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:49.092326 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:49.092578 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:49.092343 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:49.092578 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:49.092415 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:53.092397428 +0000 UTC m=+41.630319638 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:16:49.092578 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:49.092432 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:53.092424124 +0000 UTC m=+41.630346334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:16:50.248659 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:50.248626 2577 generic.go:358] "Generic (PLEG): container finished" podID="43cab76d-7c1c-49f8-8a36-79896bc24bdc" containerID="fe0eed2fc9e971e03a48511f13dd305f2681a273e085b926b8ba08c2738c3daf" exitCode=0 Apr 17 11:16:50.249039 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:50.248668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerDied","Data":"fe0eed2fc9e971e03a48511f13dd305f2681a273e085b926b8ba08c2738c3daf"} Apr 17 11:16:51.252913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:51.252878 2577 generic.go:358] "Generic (PLEG): container finished" podID="43cab76d-7c1c-49f8-8a36-79896bc24bdc" containerID="917a1c55b51435cbd62fc89952a80cec3aa981117a697dbe149b84880c09959f" exitCode=0 Apr 17 11:16:51.253413 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:51.252920 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" 
event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerDied","Data":"917a1c55b51435cbd62fc89952a80cec3aa981117a697dbe149b84880c09959f"} Apr 17 11:16:52.257382 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:52.257335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" event={"ID":"43cab76d-7c1c-49f8-8a36-79896bc24bdc","Type":"ContainerStarted","Data":"d08265398350af48dafd6dad9035b2944d96b1679c5fc0c0bc7b99760b27cf8a"} Apr 17 11:16:52.281091 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:52.281042 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ww4kd" podStartSLOduration=5.252308381 podStartE2EDuration="40.281028705s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.288393175 +0000 UTC m=+2.826315400" lastFinishedPulling="2026-04-17 11:16:49.317113515 +0000 UTC m=+37.855035724" observedRunningTime="2026-04-17 11:16:52.280088984 +0000 UTC m=+40.818011216" watchObservedRunningTime="2026-04-17 11:16:52.281028705 +0000 UTC m=+40.818950937" Apr 17 11:16:53.124420 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:53.124354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:16:53.124589 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:53.124470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:16:53.124589 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:53.124522 2577 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:53.124589 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:53.124586 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:17:01.124570096 +0000 UTC m=+49.662492308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:16:53.124715 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:53.124605 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:53.124715 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:16:53.124654 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:01.124641271 +0000 UTC m=+49.662563485 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:16:56.811150 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.811116 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv"] Apr 17 11:16:56.826880 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.826854 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv"] Apr 17 11:16:56.827025 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.826957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:56.829605 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.829584 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 11:16:56.829740 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.829607 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 11:16:56.829740 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.829623 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-g9gcm\"" Apr 17 11:16:56.829740 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.829647 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 11:16:56.829740 ip-10-0-128-205 kubenswrapper[2577]: I0417 
11:16:56.829694 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 11:16:56.949200 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.949171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4f7a314a-b849-4a84-a9ba-c8fd75094d28-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65494997f7-8zfdv\" (UID: \"4f7a314a-b849-4a84-a9ba-c8fd75094d28\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:56.949381 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:56.949235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56sb\" (UniqueName: \"kubernetes.io/projected/4f7a314a-b849-4a84-a9ba-c8fd75094d28-kube-api-access-k56sb\") pod \"managed-serviceaccount-addon-agent-65494997f7-8zfdv\" (UID: \"4f7a314a-b849-4a84-a9ba-c8fd75094d28\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:57.049649 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:57.049612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4f7a314a-b849-4a84-a9ba-c8fd75094d28-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65494997f7-8zfdv\" (UID: \"4f7a314a-b849-4a84-a9ba-c8fd75094d28\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:57.049813 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:57.049665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k56sb\" (UniqueName: \"kubernetes.io/projected/4f7a314a-b849-4a84-a9ba-c8fd75094d28-kube-api-access-k56sb\") pod 
\"managed-serviceaccount-addon-agent-65494997f7-8zfdv\" (UID: \"4f7a314a-b849-4a84-a9ba-c8fd75094d28\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:57.052426 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:57.052395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4f7a314a-b849-4a84-a9ba-c8fd75094d28-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65494997f7-8zfdv\" (UID: \"4f7a314a-b849-4a84-a9ba-c8fd75094d28\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:57.057605 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:57.057575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56sb\" (UniqueName: \"kubernetes.io/projected/4f7a314a-b849-4a84-a9ba-c8fd75094d28-kube-api-access-k56sb\") pod \"managed-serviceaccount-addon-agent-65494997f7-8zfdv\" (UID: \"4f7a314a-b849-4a84-a9ba-c8fd75094d28\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:57.145079 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:57.144996 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" Apr 17 11:16:57.304423 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:57.304395 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv"] Apr 17 11:16:57.307784 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:16:57.307736 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f7a314a_b849_4a84_a9ba_c8fd75094d28.slice/crio-3f44a79a34dbe08776d53608ef15a2a2868bcad48c8f8a6139532ec28940e55a WatchSource:0}: Error finding container 3f44a79a34dbe08776d53608ef15a2a2868bcad48c8f8a6139532ec28940e55a: Status 404 returned error can't find the container with id 3f44a79a34dbe08776d53608ef15a2a2868bcad48c8f8a6139532ec28940e55a Apr 17 11:16:58.270714 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:16:58.270673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" event={"ID":"4f7a314a-b849-4a84-a9ba-c8fd75094d28","Type":"ContainerStarted","Data":"3f44a79a34dbe08776d53608ef15a2a2868bcad48c8f8a6139532ec28940e55a"} Apr 17 11:17:01.181970 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:01.181936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:17:01.182478 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:01.181993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: 
\"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:17:01.182478 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:01.182088 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:01.182478 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:01.182092 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:01.182478 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:01.182151 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:17:17.182137516 +0000 UTC m=+65.720059726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:17:01.182478 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:01.182164 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:17.182158262 +0000 UTC m=+65.720080471 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:17:01.277304 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:01.277268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" event={"ID":"4f7a314a-b849-4a84-a9ba-c8fd75094d28","Type":"ContainerStarted","Data":"923cd1d0129ebc7cdc08eb63f35c7afc072faa44603aff8286c22815b3008415"} Apr 17 11:17:01.291518 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:01.291459 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" podStartSLOduration=2.151164745 podStartE2EDuration="5.29144319s" podCreationTimestamp="2026-04-17 11:16:56 +0000 UTC" firstStartedPulling="2026-04-17 11:16:57.30955472 +0000 UTC m=+45.847476936" lastFinishedPulling="2026-04-17 11:17:00.449833167 +0000 UTC m=+48.987755381" observedRunningTime="2026-04-17 11:17:01.291337158 +0000 UTC m=+49.829259516" watchObservedRunningTime="2026-04-17 11:17:01.29144319 +0000 UTC m=+49.829365423" Apr 17 11:17:10.230960 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:10.230925 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hzwl5" Apr 17 11:17:17.191509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.191459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:17:17.191896 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.191529 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:17:17.191896 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:17.191609 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:17.191896 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:17.191618 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:17.191896 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:17.191669 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. No retries permitted until 2026-04-17 11:17:49.191654572 +0000 UTC m=+97.729576783 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:17:17.191896 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:17.191684 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:49.191677336 +0000 UTC m=+97.729599547 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:17:17.696336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.696296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx" Apr 17 11:17:17.696559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.696426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:17:17.698618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.698594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:17.698719 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.698602 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:17.707189 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:17.707172 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:17:17.707244 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:17.707232 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 
nodeName:}" failed. No retries permitted until 2026-04-17 11:18:21.707210603 +0000 UTC m=+130.245132812 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : secret "metrics-daemon-secret" not found Apr 17 11:17:17.709057 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.709044 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:17.719968 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.719951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxsh\" (UniqueName: \"kubernetes.io/projected/5848b99b-c76c-47de-b92e-288c830c8a96-kube-api-access-rhxsh\") pod \"network-check-target-pb4l4\" (UID: \"5848b99b-c76c-47de-b92e-288c830c8a96\") " pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:17:17.983804 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.983725 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nzl46\"" Apr 17 11:17:17.992220 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:17.992204 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:17:18.103195 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:18.103163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pb4l4"] Apr 17 11:17:18.108246 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:17:18.108218 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5848b99b_c76c_47de_b92e_288c830c8a96.slice/crio-666b31ad24203d09d49c41f24593d2ada076bf3bea01c1514fa59376cbb496f9 WatchSource:0}: Error finding container 666b31ad24203d09d49c41f24593d2ada076bf3bea01c1514fa59376cbb496f9: Status 404 returned error can't find the container with id 666b31ad24203d09d49c41f24593d2ada076bf3bea01c1514fa59376cbb496f9 Apr 17 11:17:18.312117 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:18.312032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pb4l4" event={"ID":"5848b99b-c76c-47de-b92e-288c830c8a96","Type":"ContainerStarted","Data":"666b31ad24203d09d49c41f24593d2ada076bf3bea01c1514fa59376cbb496f9"} Apr 17 11:17:21.319300 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:21.319263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pb4l4" event={"ID":"5848b99b-c76c-47de-b92e-288c830c8a96","Type":"ContainerStarted","Data":"85a2b262c00cebb9e550c46b1c38dfce303fb14830ffb253497759f0a9f56cad"} Apr 17 11:17:21.319760 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:21.319526 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:17:21.334710 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:21.334661 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pb4l4" 
podStartSLOduration=66.792496214 podStartE2EDuration="1m9.334650044s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:17:18.110143328 +0000 UTC m=+66.648065538" lastFinishedPulling="2026-04-17 11:17:20.652297155 +0000 UTC m=+69.190219368" observedRunningTime="2026-04-17 11:17:21.33368354 +0000 UTC m=+69.871605771" watchObservedRunningTime="2026-04-17 11:17:21.334650044 +0000 UTC m=+69.872572275" Apr 17 11:17:49.219581 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:49.219540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:17:49.219961 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:49.219595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:17:49.219961 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:49.219691 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:49.219961 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:49.219693 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:49.219961 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:49.219768 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert podName:9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:53.219749991 +0000 UTC m=+161.757672204 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert") pod "ingress-canary-kxgxj" (UID: "9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb") : secret "canary-serving-cert" not found Apr 17 11:17:49.219961 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:17:49.219782 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls podName:142cbae1-73ac-4077-9d7f-b3393da4de44 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:53.219776025 +0000 UTC m=+161.757698234 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls") pod "dns-default-crscn" (UID: "142cbae1-73ac-4077-9d7f-b3393da4de44") : secret "dns-default-metrics-tls" not found Apr 17 11:17:52.323814 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:17:52.323784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pb4l4" Apr 17 11:18:08.024813 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.024766 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7cbc658598-mr5gb"] Apr 17 11:18:08.027582 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.027552 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.029575 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.029559 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hnq7c\"" Apr 17 11:18:08.029677 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.029559 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 11:18:08.029722 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.029698 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 11:18:08.029722 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.029707 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 11:18:08.029845 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.029830 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 11:18:08.030170 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.030153 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 11:18:08.030271 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.030252 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 11:18:08.039288 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.039259 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7cbc658598-mr5gb"] Apr 17 11:18:08.145065 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.145033 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.145214 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.145081 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.145214 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.145100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2d6w\" (UniqueName: \"kubernetes.io/projected/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-kube-api-access-c2d6w\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.145214 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.145173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-default-certificate\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.145214 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.145209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-stats-auth\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" 
Apr 17 11:18:08.246102 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.246077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.246254 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.246108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2d6w\" (UniqueName: \"kubernetes.io/projected/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-kube-api-access-c2d6w\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.246254 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:08.246214 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:08.246254 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.246232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-default-certificate\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.246375 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:08.246283 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:08.746264335 +0000 UTC m=+117.284186548 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : secret "router-metrics-certs-default" not found Apr 17 11:18:08.246375 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.246304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-stats-auth\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.246463 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.246386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.246531 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:08.246519 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:08.746505384 +0000 UTC m=+117.284427614 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:08.248629 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.248602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-default-certificate\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.248717 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.248649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-stats-auth\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.256684 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.256652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2d6w\" (UniqueName: \"kubernetes.io/projected/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-kube-api-access-c2d6w\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.749320 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.749269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 
11:18:08.749320 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:08.749333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:08.749606 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:08.749458 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:08.749606 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:08.749499 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:09.749476164 +0000 UTC m=+118.287398388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:08.749606 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:08.749526 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:09.74951639 +0000 UTC m=+118.287438607 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : secret "router-metrics-certs-default" not found Apr 17 11:18:09.756651 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:09.756616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:09.757076 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:09.756670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:09.757076 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:09.756783 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:09.757076 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:09.756822 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:11.756799749 +0000 UTC m=+120.294721971 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:09.757076 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:09.756853 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:11.756842573 +0000 UTC m=+120.294764782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : secret "router-metrics-certs-default" not found Apr 17 11:18:11.771908 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:11.771872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:11.772336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:11.771948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:11.772336 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:11.772057 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: 
secret "router-metrics-certs-default" not found Apr 17 11:18:11.772336 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:11.772074 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:15.772057093 +0000 UTC m=+124.309979305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:11.772336 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:11.772125 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:15.772106375 +0000 UTC m=+124.310028585 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : secret "router-metrics-certs-default" not found Apr 17 11:18:13.001535 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:13.001506 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5rdwf_440a3dee-fa32-4dd9-8f44-d5532ff12996/dns-node-resolver/0.log" Apr 17 11:18:13.802786 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:13.802760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-btbxk_0a040950-ccaf-4d81-8e53-7c50e5eca541/node-ca/0.log" Apr 17 11:18:15.798877 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:15.798833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:15.799314 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:15.798932 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:15.799314 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:15.798983 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:15.799314 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:15.799063 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:23.799048206 +0000 UTC m=+132.336970420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : secret "router-metrics-certs-default" not found Apr 17 11:18:15.799314 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:15.799088 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:23.799070347 +0000 UTC m=+132.336992564 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:17.749553 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.749521 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"] Apr 17 11:18:17.751261 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.751245 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h" Apr 17 11:18:17.753802 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.753782 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:18:17.753916 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.753827 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 11:18:17.754012 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.754000 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 11:18:17.754883 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.754863 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-k8jlc\"" Apr 17 11:18:17.755114 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.754866 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 11:18:17.760439 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.760416 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"] Apr 17 11:18:17.811674 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.811629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg4sw\" (UniqueName: \"kubernetes.io/projected/241708e9-6c54-4758-aa09-fa52e406c967-kube-api-access-lg4sw\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: 
\"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.811851 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.811718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/241708e9-6c54-4758-aa09-fa52e406c967-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.811851 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.811744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241708e9-6c54-4758-aa09-fa52e406c967-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.850772 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.850739 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"]
Apr 17 11:18:17.852468 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.852452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:17.854664 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.854634 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:17.854664 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.854651 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 11:18:17.854836 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.854635 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:17.854836 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.854654 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zc6sg\""
Apr 17 11:18:17.854836 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.854635 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 11:18:17.863942 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.863919 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"]
Apr 17 11:18:17.912181 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg4sw\" (UniqueName: \"kubernetes.io/projected/241708e9-6c54-4758-aa09-fa52e406c967-kube-api-access-lg4sw\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.912351 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8e8dee-ad5b-4371-84f8-4e123925013a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:17.912351 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bb8m\" (UniqueName: \"kubernetes.io/projected/9e8e8dee-ad5b-4371-84f8-4e123925013a-kube-api-access-5bb8m\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:17.912351 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e8e8dee-ad5b-4371-84f8-4e123925013a-config\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:17.912490 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/241708e9-6c54-4758-aa09-fa52e406c967-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.912490 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241708e9-6c54-4758-aa09-fa52e406c967-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.912941 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.912923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241708e9-6c54-4758-aa09-fa52e406c967-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.914665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.914647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/241708e9-6c54-4758-aa09-fa52e406c967-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:17.920545 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:17.920518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg4sw\" (UniqueName: \"kubernetes.io/projected/241708e9-6c54-4758-aa09-fa52e406c967-kube-api-access-lg4sw\") pod \"kube-storage-version-migrator-operator-6769c5d45-6sx6h\" (UID: \"241708e9-6c54-4758-aa09-fa52e406c967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:18.013150 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.013062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8e8dee-ad5b-4371-84f8-4e123925013a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.013150 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.013108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bb8m\" (UniqueName: \"kubernetes.io/projected/9e8e8dee-ad5b-4371-84f8-4e123925013a-kube-api-access-5bb8m\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.013150 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.013128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e8e8dee-ad5b-4371-84f8-4e123925013a-config\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.013630 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.013602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e8e8dee-ad5b-4371-84f8-4e123925013a-config\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.015571 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.015550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8e8dee-ad5b-4371-84f8-4e123925013a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.020935 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.020909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bb8m\" (UniqueName: \"kubernetes.io/projected/9e8e8dee-ad5b-4371-84f8-4e123925013a-kube-api-access-5bb8m\") pod \"service-ca-operator-d6fc45fc5-45hcq\" (UID: \"9e8e8dee-ad5b-4371-84f8-4e123925013a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.062299 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.062270 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"
Apr 17 11:18:18.161504 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.161473 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"
Apr 17 11:18:18.177260 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.177183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h"]
Apr 17 11:18:18.179601 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:18.179573 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241708e9_6c54_4758_aa09_fa52e406c967.slice/crio-b411d22fad028588c3d3aff1d76cd3e1ea30ea52480327276b149acf8531f49a WatchSource:0}: Error finding container b411d22fad028588c3d3aff1d76cd3e1ea30ea52480327276b149acf8531f49a: Status 404 returned error can't find the container with id b411d22fad028588c3d3aff1d76cd3e1ea30ea52480327276b149acf8531f49a
Apr 17 11:18:18.278051 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.277972 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq"]
Apr 17 11:18:18.281226 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:18.281201 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8e8dee_ad5b_4371_84f8_4e123925013a.slice/crio-667de614799f51b3d9041a9d73790ba1eaee8306020d9260377d387b574e8be3 WatchSource:0}: Error finding container 667de614799f51b3d9041a9d73790ba1eaee8306020d9260377d387b574e8be3: Status 404 returned error can't find the container with id 667de614799f51b3d9041a9d73790ba1eaee8306020d9260377d387b574e8be3
Apr 17 11:18:18.429553 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.429510 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq" event={"ID":"9e8e8dee-ad5b-4371-84f8-4e123925013a","Type":"ContainerStarted","Data":"667de614799f51b3d9041a9d73790ba1eaee8306020d9260377d387b574e8be3"}
Apr 17 11:18:18.430411 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:18.430387 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h" event={"ID":"241708e9-6c54-4758-aa09-fa52e406c967","Type":"ContainerStarted","Data":"b411d22fad028588c3d3aff1d76cd3e1ea30ea52480327276b149acf8531f49a"}
Apr 17 11:18:21.437822 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.437782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq" event={"ID":"9e8e8dee-ad5b-4371-84f8-4e123925013a","Type":"ContainerStarted","Data":"d5fbb67ea1a5bc848c3c561cc3eacd3c68ae9ac343aa1c39b8c0a1e13c442dec"}
Apr 17 11:18:21.439121 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.439092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h" event={"ID":"241708e9-6c54-4758-aa09-fa52e406c967","Type":"ContainerStarted","Data":"6416f40b9510bc7f9c09933eb63d903be2d6098109bb52257fb180e788bd2292"}
Apr 17 11:18:21.452684 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.452631 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq" podStartSLOduration=2.233666212 podStartE2EDuration="4.452612417s" podCreationTimestamp="2026-04-17 11:18:17 +0000 UTC" firstStartedPulling="2026-04-17 11:18:18.283020796 +0000 UTC m=+126.820943006" lastFinishedPulling="2026-04-17 11:18:20.501967001 +0000 UTC m=+129.039889211" observedRunningTime="2026-04-17 11:18:21.451615665 +0000 UTC m=+129.989537898" watchObservedRunningTime="2026-04-17 11:18:21.452612417 +0000 UTC m=+129.990534653"
Apr 17 11:18:21.466321 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.466277 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h" podStartSLOduration=2.146998033 podStartE2EDuration="4.466265332s" podCreationTimestamp="2026-04-17 11:18:17 +0000 UTC" firstStartedPulling="2026-04-17 11:18:18.181533374 +0000 UTC m=+126.719455584" lastFinishedPulling="2026-04-17 11:18:20.50080067 +0000 UTC m=+129.038722883" observedRunningTime="2026-04-17 11:18:21.465162216 +0000 UTC m=+130.003084449" watchObservedRunningTime="2026-04-17 11:18:21.466265332 +0000 UTC m=+130.004187563"
Apr 17 11:18:21.714442 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.714403 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"]
Apr 17 11:18:21.716341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.716321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"
Apr 17 11:18:21.718614 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.718594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6f469\""
Apr 17 11:18:21.718843 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.718822 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 11:18:21.719071 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.719054 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:21.726381 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.726339 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"]
Apr 17 11:18:21.742724 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.742698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:18:21.742818 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:21.742796 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:18:21.742868 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:21.742846 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs podName:ecbf8c24-6e0b-4d26-9530-6bcc59825ca0 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:23.742832827 +0000 UTC m=+252.280755037 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs") pod "network-metrics-daemon-dn4mx" (UID: "ecbf8c24-6e0b-4d26-9530-6bcc59825ca0") : secret "metrics-daemon-secret" not found
Apr 17 11:18:21.843694 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.843657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9zl\" (UniqueName: \"kubernetes.io/projected/f882661d-584f-41d1-9758-7e68f8c80cc5-kube-api-access-6g9zl\") pod \"migrator-74bb7799d9-vgmt2\" (UID: \"f882661d-584f-41d1-9758-7e68f8c80cc5\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"
Apr 17 11:18:21.944879 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.944843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9zl\" (UniqueName: \"kubernetes.io/projected/f882661d-584f-41d1-9758-7e68f8c80cc5-kube-api-access-6g9zl\") pod \"migrator-74bb7799d9-vgmt2\" (UID: \"f882661d-584f-41d1-9758-7e68f8c80cc5\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"
Apr 17 11:18:21.952973 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:21.952952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9zl\" (UniqueName: \"kubernetes.io/projected/f882661d-584f-41d1-9758-7e68f8c80cc5-kube-api-access-6g9zl\") pod \"migrator-74bb7799d9-vgmt2\" (UID: \"f882661d-584f-41d1-9758-7e68f8c80cc5\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"
Apr 17 11:18:22.025678 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.025587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"
Apr 17 11:18:22.146735 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.146707 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2"]
Apr 17 11:18:22.150069 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:22.150036 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf882661d_584f_41d1_9758_7e68f8c80cc5.slice/crio-393827a65e88fe4ef5a0b8d33db67f34a802418098f85d060a068da6fdb80a9e WatchSource:0}: Error finding container 393827a65e88fe4ef5a0b8d33db67f34a802418098f85d060a068da6fdb80a9e: Status 404 returned error can't find the container with id 393827a65e88fe4ef5a0b8d33db67f34a802418098f85d060a068da6fdb80a9e
Apr 17 11:18:22.389773 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.389705 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"]
Apr 17 11:18:22.392027 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.392011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"
Apr 17 11:18:22.393993 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.393977 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5ms6d\""
Apr 17 11:18:22.401335 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.401312 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"]
Apr 17 11:18:22.442602 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.442573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2" event={"ID":"f882661d-584f-41d1-9758-7e68f8c80cc5","Type":"ContainerStarted","Data":"393827a65e88fe4ef5a0b8d33db67f34a802418098f85d060a068da6fdb80a9e"}
Apr 17 11:18:22.549807 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.549765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhnm2\" (UniqueName: \"kubernetes.io/projected/43381832-f484-4811-93b2-5d729f55a9c7-kube-api-access-xhnm2\") pod \"network-check-source-8894fc9bd-mqf7t\" (UID: \"43381832-f484-4811-93b2-5d729f55a9c7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"
Apr 17 11:18:22.651209 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.651110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhnm2\" (UniqueName: \"kubernetes.io/projected/43381832-f484-4811-93b2-5d729f55a9c7-kube-api-access-xhnm2\") pod \"network-check-source-8894fc9bd-mqf7t\" (UID: \"43381832-f484-4811-93b2-5d729f55a9c7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"
Apr 17 11:18:22.659954 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.659925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhnm2\" (UniqueName: \"kubernetes.io/projected/43381832-f484-4811-93b2-5d729f55a9c7-kube-api-access-xhnm2\") pod \"network-check-source-8894fc9bd-mqf7t\" (UID: \"43381832-f484-4811-93b2-5d729f55a9c7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"
Apr 17 11:18:22.701538 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.701503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"
Apr 17 11:18:22.828254 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:22.828223 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t"]
Apr 17 11:18:22.832039 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:22.832006 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43381832_f484_4811_93b2_5d729f55a9c7.slice/crio-55228899fb52d03d7140119fd0eed02eb4c30100ba74bf81f4f98c58d7e1ed33 WatchSource:0}: Error finding container 55228899fb52d03d7140119fd0eed02eb4c30100ba74bf81f4f98c58d7e1ed33: Status 404 returned error can't find the container with id 55228899fb52d03d7140119fd0eed02eb4c30100ba74bf81f4f98c58d7e1ed33
Apr 17 11:18:23.193499 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.193478 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b7c457569-wmbwt"]
Apr 17 11:18:23.195169 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.195152 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.198793 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.198771 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 11:18:23.199271 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.199247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xz4zw\""
Apr 17 11:18:23.199738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.199447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 11:18:23.199738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.199730 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 11:18:23.209496 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.206630 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 11:18:23.213949 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.213923 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b7c457569-wmbwt"]
Apr 17 11:18:23.356128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4d2\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-kube-api-access-nz4d2\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356161 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-certificates\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-trusted-ca\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356271 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-image-registry-private-configuration\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-ca-trust-extracted\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-installation-pull-secrets\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.356464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.356344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-bound-sa-token\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.446867 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.446835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t" event={"ID":"43381832-f484-4811-93b2-5d729f55a9c7","Type":"ContainerStarted","Data":"9e19f0f359a7f9fea204ec2a08c7774341ead8832dd234124cd6cbb41cf5a375"}
Apr 17 11:18:23.447309 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.446876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t" event={"ID":"43381832-f484-4811-93b2-5d729f55a9c7","Type":"ContainerStarted","Data":"55228899fb52d03d7140119fd0eed02eb4c30100ba74bf81f4f98c58d7e1ed33"}
Apr 17 11:18:23.448497 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.448471 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2" event={"ID":"f882661d-584f-41d1-9758-7e68f8c80cc5","Type":"ContainerStarted","Data":"c25af3d9141d50be3651ab0d0ee732010985fd8560f98de534c5eb007376588b"}
Apr 17 11:18:23.448613 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.448504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2" event={"ID":"f882661d-584f-41d1-9758-7e68f8c80cc5","Type":"ContainerStarted","Data":"9789e58d9159ca73dedf140075a4efe252b44b4ef7c7d7bb76f6074eb4f0e36d"}
Apr 17 11:18:23.457297 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457402 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4d2\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-kube-api-access-nz4d2\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457402 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-certificates\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457504 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-trusted-ca\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457504 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.457422 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:23.457504 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.457469 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b7c457569-wmbwt: secret "image-registry-tls" not found
Apr 17 11:18:23.457639 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.457522 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls podName:abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:23.957504165 +0000 UTC m=+132.495426381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls") pod "image-registry-b7c457569-wmbwt" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84") : secret "image-registry-tls" not found
Apr 17 11:18:23.457639 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-image-registry-private-configuration\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457639 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-ca-trust-extracted\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457639 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-installation-pull-secrets\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.457835 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.457656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-bound-sa-token\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.458066 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.458038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-certificates\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.458130 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.458089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-ca-trust-extracted\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.458312 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.458287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-trusted-ca\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.460168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.460148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-installation-pull-secrets\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.460464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.460445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-image-registry-private-configuration\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.461607 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.461574 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mqf7t" podStartSLOduration=1.461562498 podStartE2EDuration="1.461562498s" podCreationTimestamp="2026-04-17 11:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:23.461063971 +0000 UTC m=+131.998986203" watchObservedRunningTime="2026-04-17 11:18:23.461562498 +0000 UTC m=+131.999484729"
Apr 17 11:18:23.466260 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.466234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4d2\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-kube-api-access-nz4d2\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.466503 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.466484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-bound-sa-token\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:18:23.475470 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.475436 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vgmt2" podStartSLOduration=1.4398172439999999 podStartE2EDuration="2.475425825s" podCreationTimestamp="2026-04-17 11:18:21 +0000 UTC" firstStartedPulling="2026-04-17 11:18:22.152314986 +0000 UTC m=+130.690237200" lastFinishedPulling="2026-04-17 11:18:23.187923557 +0000 UTC m=+131.725845781" observedRunningTime="2026-04-17 11:18:23.474634687 +0000 UTC m=+132.012556920" watchObservedRunningTime="2026-04-17 11:18:23.475425825 +0000 UTC m=+132.013348057"
Apr 17 11:18:23.861272 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.861177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb"
Apr 17 11:18:23.861453 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.861272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb"
Apr 17 11:18:23.861453 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.861334 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:39.861317014 +0000 UTC m=+148.399239229 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:23.861453 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.861404 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:23.861580 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.861472 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs podName:ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c nodeName:}" failed. No retries permitted until 2026-04-17 11:18:39.861454306 +0000 UTC m=+148.399376520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs") pod "router-default-7cbc658598-mr5gb" (UID: "ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c") : secret "router-metrics-certs-default" not found Apr 17 11:18:23.962497 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:23.962465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:23.962663 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.962641 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:23.962723 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.962667 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-b7c457569-wmbwt: secret "image-registry-tls" not found Apr 17 11:18:23.962772 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:23.962740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls podName:abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:24.962717815 +0000 UTC m=+133.500640041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls") pod "image-registry-b7c457569-wmbwt" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84") : secret "image-registry-tls" not found Apr 17 11:18:24.971319 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:24.971274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:24.971739 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:24.971461 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:24.971739 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:24.971481 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b7c457569-wmbwt: secret "image-registry-tls" not found Apr 17 11:18:24.971739 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:24.971539 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls podName:abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:26.971524374 +0000 UTC m=+135.509446588 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls") pod "image-registry-b7c457569-wmbwt" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84") : secret "image-registry-tls" not found Apr 17 11:18:26.985821 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:26.985770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:26.986203 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:26.985920 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:26.986203 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:26.985941 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b7c457569-wmbwt: secret "image-registry-tls" not found Apr 17 11:18:26.986203 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:26.985995 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls podName:abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:30.985979864 +0000 UTC m=+139.523902074 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls") pod "image-registry-b7c457569-wmbwt" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84") : secret "image-registry-tls" not found Apr 17 11:18:31.019461 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:31.019419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:31.019836 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:31.019567 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:31.019836 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:31.019589 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b7c457569-wmbwt: secret "image-registry-tls" not found Apr 17 11:18:31.019836 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:31.019645 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls podName:abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:39.019629111 +0000 UTC m=+147.557551322 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls") pod "image-registry-b7c457569-wmbwt" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84") : secret "image-registry-tls" not found Apr 17 11:18:39.084152 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.084115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:39.086657 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.086626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"image-registry-b7c457569-wmbwt\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") " pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:39.140667 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.140637 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:39.264437 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.264406 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b7c457569-wmbwt"] Apr 17 11:18:39.267654 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:39.267626 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb2cb8a_baca_4bf5_a65c_20d8f7e8ad84.slice/crio-1d144e47c2bfa228296bb453ea7b048dc9603b58f81da36f94cfe533310894d5 WatchSource:0}: Error finding container 1d144e47c2bfa228296bb453ea7b048dc9603b58f81da36f94cfe533310894d5: Status 404 returned error can't find the container with id 1d144e47c2bfa228296bb453ea7b048dc9603b58f81da36f94cfe533310894d5 Apr 17 11:18:39.489521 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.489484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" event={"ID":"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84","Type":"ContainerStarted","Data":"e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253"} Apr 17 11:18:39.489521 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.489521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" event={"ID":"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84","Type":"ContainerStarted","Data":"1d144e47c2bfa228296bb453ea7b048dc9603b58f81da36f94cfe533310894d5"} Apr 17 11:18:39.489758 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.489617 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" Apr 17 11:18:39.508116 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.508063 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" 
podStartSLOduration=16.50804878 podStartE2EDuration="16.50804878s" podCreationTimestamp="2026-04-17 11:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:39.507601971 +0000 UTC m=+148.045524212" watchObservedRunningTime="2026-04-17 11:18:39.50804878 +0000 UTC m=+148.045971012" Apr 17 11:18:39.891681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.891603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:39.891681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.891671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:39.892330 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.892309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-service-ca-bundle\") pod \"router-default-7cbc658598-mr5gb\" (UID: \"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:39.894046 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:39.894029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c-metrics-certs\") pod \"router-default-7cbc658598-mr5gb\" (UID: 
\"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c\") " pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:40.139800 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:40.139769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hnq7c\"" Apr 17 11:18:40.147634 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:40.147564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:40.269056 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:40.269023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7cbc658598-mr5gb"] Apr 17 11:18:40.272731 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:40.272698 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad9fdeae_eb5e_4ef6_8d92_555c9c355b3c.slice/crio-c0ab63ae844e6f06158f939cd368db486a92a9969e705d58d2ca2cd13f20581e WatchSource:0}: Error finding container c0ab63ae844e6f06158f939cd368db486a92a9969e705d58d2ca2cd13f20581e: Status 404 returned error can't find the container with id c0ab63ae844e6f06158f939cd368db486a92a9969e705d58d2ca2cd13f20581e Apr 17 11:18:40.494143 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:40.494097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7cbc658598-mr5gb" event={"ID":"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c","Type":"ContainerStarted","Data":"e0a8177a4365a4ed7b00591e066f3108c3b450d0b9f3f640ef3c7fe328c6cc2a"} Apr 17 11:18:40.494500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:40.494149 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7cbc658598-mr5gb" event={"ID":"ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c","Type":"ContainerStarted","Data":"c0ab63ae844e6f06158f939cd368db486a92a9969e705d58d2ca2cd13f20581e"} Apr 17 11:18:40.513102 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:40.513056 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7cbc658598-mr5gb" podStartSLOduration=32.513042355 podStartE2EDuration="32.513042355s" podCreationTimestamp="2026-04-17 11:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:40.512350845 +0000 UTC m=+149.050273078" watchObservedRunningTime="2026-04-17 11:18:40.513042355 +0000 UTC m=+149.050964581" Apr 17 11:18:41.148473 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:41.148435 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:41.150824 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:41.150802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:41.496127 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:41.496094 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:41.497349 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:41.497331 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7cbc658598-mr5gb" Apr 17 11:18:43.567837 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.567808 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8h88w"] Apr 17 11:18:43.571389 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.571338 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.573688 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.573664 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-64q7x\"" Apr 17 11:18:43.573840 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.573673 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:18:43.574557 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.574517 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:18:43.574688 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.574518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 11:18:43.574688 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.574679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 11:18:43.585577 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.585551 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8h88w"] Apr 17 11:18:43.619170 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.619120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/54915c0c-e586-4854-9937-807160b46bb8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.619336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.619190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/54915c0c-e586-4854-9937-807160b46bb8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.619336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.619277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqtr4\" (UniqueName: \"kubernetes.io/projected/54915c0c-e586-4854-9937-807160b46bb8-kube-api-access-wqtr4\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.619336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.619332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/54915c0c-e586-4854-9937-807160b46bb8-data-volume\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.619509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.619402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/54915c0c-e586-4854-9937-807160b46bb8-crio-socket\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.638026 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.637996 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b7c457569-wmbwt"] Apr 17 11:18:43.693671 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.693639 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-9d9f47c87-m5vrb"] Apr 17 11:18:43.696530 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.696507 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.710589 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.710560 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9d9f47c87-m5vrb"] Apr 17 11:18:43.720707 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-registry-certificates\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.720707 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-ca-trust-extracted\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.720890 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqtr4\" (UniqueName: \"kubernetes.io/projected/54915c0c-e586-4854-9937-807160b46bb8-kube-api-access-wqtr4\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.720890 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720813 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-registry-tls\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.720890 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jrzs\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-kube-api-access-2jrzs\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.720890 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-image-registry-private-configuration\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.721025 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-installation-pull-secrets\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.721025 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/54915c0c-e586-4854-9937-807160b46bb8-data-volume\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.721025 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.720989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/54915c0c-e586-4854-9937-807160b46bb8-crio-socket\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.721128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-trusted-ca\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.721128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/54915c0c-e586-4854-9937-807160b46bb8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.721128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-bound-sa-token\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 
11:18:43.721128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/54915c0c-e586-4854-9937-807160b46bb8-crio-socket\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.721128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/54915c0c-e586-4854-9937-807160b46bb8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.721316 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/54915c0c-e586-4854-9937-807160b46bb8-data-volume\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.721587 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.721567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/54915c0c-e586-4854-9937-807160b46bb8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.723519 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.723502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/54915c0c-e586-4854-9937-807160b46bb8-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.754530 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.754498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqtr4\" (UniqueName: \"kubernetes.io/projected/54915c0c-e586-4854-9937-807160b46bb8-kube-api-access-wqtr4\") pod \"insights-runtime-extractor-8h88w\" (UID: \"54915c0c-e586-4854-9937-807160b46bb8\") " pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:43.822106 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-trusted-ca\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822106 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-bound-sa-token\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822315 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-registry-certificates\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822315 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822257 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-ca-trust-extracted\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822315 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-registry-tls\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822514 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jrzs\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-kube-api-access-2jrzs\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822514 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-image-registry-private-configuration\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822514 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-installation-pull-secrets\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: 
\"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.822733 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.822711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-ca-trust-extracted\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.823055 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.823037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-registry-certificates\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.823145 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.823065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-trusted-ca\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.824960 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.824941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-registry-tls\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.825048 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.824939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-installation-pull-secrets\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.825048 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.825008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-image-registry-private-configuration\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.838667 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.838640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-bound-sa-token\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.839760 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.839736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jrzs\" (UniqueName: \"kubernetes.io/projected/76638c14-f3f9-4a5a-88fc-a5cd6da627ed-kube-api-access-2jrzs\") pod \"image-registry-9d9f47c87-m5vrb\" (UID: \"76638c14-f3f9-4a5a-88fc-a5cd6da627ed\") " pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:43.881206 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:43.881167 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8h88w" Apr 17 11:18:44.001526 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.001490 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8h88w"] Apr 17 11:18:44.004412 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:44.004386 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54915c0c_e586_4854_9937_807160b46bb8.slice/crio-f52f680e3a0ec690471979feddaf676bb3211c0bcae6bbb551174570ac10176f WatchSource:0}: Error finding container f52f680e3a0ec690471979feddaf676bb3211c0bcae6bbb551174570ac10176f: Status 404 returned error can't find the container with id f52f680e3a0ec690471979feddaf676bb3211c0bcae6bbb551174570ac10176f Apr 17 11:18:44.005344 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.005329 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:44.172665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.172632 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9d9f47c87-m5vrb"] Apr 17 11:18:44.175210 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:44.175185 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76638c14_f3f9_4a5a_88fc_a5cd6da627ed.slice/crio-b1e5999ea1517df76c651477a6e91678b6815a99caebcbbbda274e764fcde72f WatchSource:0}: Error finding container b1e5999ea1517df76c651477a6e91678b6815a99caebcbbbda274e764fcde72f: Status 404 returned error can't find the container with id b1e5999ea1517df76c651477a6e91678b6815a99caebcbbbda274e764fcde72f Apr 17 11:18:44.505619 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.505579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" event={"ID":"76638c14-f3f9-4a5a-88fc-a5cd6da627ed","Type":"ContainerStarted","Data":"2085c0a3b50b90be04c43dbe0895dc009d4e0569172b3a1707da0dc8497df3f8"} Apr 17 11:18:44.505799 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.505624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" event={"ID":"76638c14-f3f9-4a5a-88fc-a5cd6da627ed","Type":"ContainerStarted","Data":"b1e5999ea1517df76c651477a6e91678b6815a99caebcbbbda274e764fcde72f"} Apr 17 11:18:44.505799 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.505721 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" Apr 17 11:18:44.506854 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.506834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8h88w" event={"ID":"54915c0c-e586-4854-9937-807160b46bb8","Type":"ContainerStarted","Data":"9e882b1f551d659ddca459aee7eff5c2ef2c17aec783319470dacbace27faaca"} Apr 17 11:18:44.506953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:44.506858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8h88w" event={"ID":"54915c0c-e586-4854-9937-807160b46bb8","Type":"ContainerStarted","Data":"f52f680e3a0ec690471979feddaf676bb3211c0bcae6bbb551174570ac10176f"} Apr 17 11:18:45.511579 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:45.511533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8h88w" event={"ID":"54915c0c-e586-4854-9937-807160b46bb8","Type":"ContainerStarted","Data":"df1d8418e7c7ec646f895ad092e39319799dc631e11a95a616a28539d8bc454a"} Apr 17 11:18:46.518218 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:46.518181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-8h88w" event={"ID":"54915c0c-e586-4854-9937-807160b46bb8","Type":"ContainerStarted","Data":"327a685c279dcfcb1a5fd1253a2fc984d39e919db297fbd1bb9c43fe46265185"} Apr 17 11:18:46.541335 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:46.541290 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" podStartSLOduration=3.541277206 podStartE2EDuration="3.541277206s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:44.53466919 +0000 UTC m=+153.072591424" watchObservedRunningTime="2026-04-17 11:18:46.541277206 +0000 UTC m=+155.079199439" Apr 17 11:18:46.541676 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:46.541655 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8h88w" podStartSLOduration=1.520419296 podStartE2EDuration="3.541647939s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:44.072646183 +0000 UTC m=+152.610568396" lastFinishedPulling="2026-04-17 11:18:46.093874826 +0000 UTC m=+154.631797039" observedRunningTime="2026-04-17 11:18:46.54123731 +0000 UTC m=+155.079159532" watchObservedRunningTime="2026-04-17 11:18:46.541647939 +0000 UTC m=+155.079570282" Apr 17 11:18:48.340143 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:48.340099 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-crscn" podUID="142cbae1-73ac-4077-9d7f-b3393da4de44" Apr 17 11:18:48.349196 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:48.349169 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], 
failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kxgxj" podUID="9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb" Apr 17 11:18:48.522949 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:48.522921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:18:48.523061 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:48.522926 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-crscn" Apr 17 11:18:50.076184 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:50.076152 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dn4mx" podUID="ecbf8c24-6e0b-4d26-9530-6bcc59825ca0" Apr 17 11:18:51.664283 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.664241 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k9kpt"] Apr 17 11:18:51.669245 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.669222 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.671145 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.671116 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:18:51.671271 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.671198 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2bm54\"" Apr 17 11:18:51.671271 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.671232 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 11:18:51.671572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.671557 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 11:18:51.671870 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.671853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:18:51.672239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.672225 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:18:51.672538 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.672524 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 11:18:51.685645 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-tls\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " 
pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685771 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685661 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-wtmp\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685771 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-textfile\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685771 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-root\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-sys\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d8c1d69-d085-43d0-8ee2-384f6a278430-metrics-client-ca\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-accelerators-collector-config\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685968 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.685968 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.685913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nkr9\" (UniqueName: \"kubernetes.io/projected/7d8c1d69-d085-43d0-8ee2-384f6a278430-kube-api-access-4nkr9\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786238 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-tls\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 
11:18:51.786238 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-wtmp\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-textfile\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-root\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-sys\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d8c1d69-d085-43d0-8ee2-384f6a278430-metrics-client-ca\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 
11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-accelerators-collector-config\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:51.786393 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-sys\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-wtmp\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786396 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7d8c1d69-d085-43d0-8ee2-384f6a278430-root\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786511 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:18:51.786482 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-tls podName:7d8c1d69-d085-43d0-8ee2-384f6a278430 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.286460046 +0000 UTC m=+160.824382262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-tls") pod "node-exporter-k9kpt" (UID: "7d8c1d69-d085-43d0-8ee2-384f6a278430") : secret "node-exporter-tls" not found Apr 17 11:18:51.786932 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nkr9\" (UniqueName: \"kubernetes.io/projected/7d8c1d69-d085-43d0-8ee2-384f6a278430-kube-api-access-4nkr9\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.786932 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-textfile\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.787006 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.786987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d8c1d69-d085-43d0-8ee2-384f6a278430-metrics-client-ca\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.788904 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.788887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.788959 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.788938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-accelerators-collector-config\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:51.799876 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:51.799847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nkr9\" (UniqueName: \"kubernetes.io/projected/7d8c1d69-d085-43d0-8ee2-384f6a278430-kube-api-access-4nkr9\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:52.291327 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:52.291283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-tls\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:52.293776 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:52.293747 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7d8c1d69-d085-43d0-8ee2-384f6a278430-node-exporter-tls\") pod \"node-exporter-k9kpt\" (UID: \"7d8c1d69-d085-43d0-8ee2-384f6a278430\") " pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:52.578884 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:52.578781 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k9kpt" Apr 17 11:18:52.587024 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:52.586998 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d8c1d69_d085_43d0_8ee2_384f6a278430.slice/crio-6077387bf930caa26e60ffa159b7866bd5088f3f548148d0de09aa69420cdbeb WatchSource:0}: Error finding container 6077387bf930caa26e60ffa159b7866bd5088f3f548148d0de09aa69420cdbeb: Status 404 returned error can't find the container with id 6077387bf930caa26e60ffa159b7866bd5088f3f548148d0de09aa69420cdbeb Apr 17 11:18:53.300577 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.300539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:18:53.301004 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.300646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:18:53.303257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.303234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/142cbae1-73ac-4077-9d7f-b3393da4de44-metrics-tls\") pod \"dns-default-crscn\" (UID: \"142cbae1-73ac-4077-9d7f-b3393da4de44\") " pod="openshift-dns/dns-default-crscn" Apr 17 11:18:53.303429 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.303405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb-cert\") pod \"ingress-canary-kxgxj\" (UID: \"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb\") " pod="openshift-ingress-canary/ingress-canary-kxgxj" Apr 17 11:18:53.326432 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.326403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bd2vh\"" Apr 17 11:18:53.326874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.326855 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j8tns\"" Apr 17 11:18:53.334467 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.334439 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-crscn" Apr 17 11:18:53.334599 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.334514 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kxgxj"
Apr 17 11:18:53.489447 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.489420 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-crscn"]
Apr 17 11:18:53.492219 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:53.492189 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod142cbae1_73ac_4077_9d7f_b3393da4de44.slice/crio-66370b130955024a3c14443b98f062707284f6ab068a25f96649d986105e3f49 WatchSource:0}: Error finding container 66370b130955024a3c14443b98f062707284f6ab068a25f96649d986105e3f49: Status 404 returned error can't find the container with id 66370b130955024a3c14443b98f062707284f6ab068a25f96649d986105e3f49
Apr 17 11:18:53.503863 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.503844 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kxgxj"]
Apr 17 11:18:53.506769 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:18:53.506745 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f51a76d_b37b_4ecc_8919_ea7f1f06e2cb.slice/crio-af3d69a0125565de86fed37fc0188990c50bb88dffc5abf90ee31cfba8e9988c WatchSource:0}: Error finding container af3d69a0125565de86fed37fc0188990c50bb88dffc5abf90ee31cfba8e9988c: Status 404 returned error can't find the container with id af3d69a0125565de86fed37fc0188990c50bb88dffc5abf90ee31cfba8e9988c
Apr 17 11:18:53.537063 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.537036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-crscn" event={"ID":"142cbae1-73ac-4077-9d7f-b3393da4de44","Type":"ContainerStarted","Data":"66370b130955024a3c14443b98f062707284f6ab068a25f96649d986105e3f49"}
Apr 17 11:18:53.538148 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.538124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kxgxj" event={"ID":"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb","Type":"ContainerStarted","Data":"af3d69a0125565de86fed37fc0188990c50bb88dffc5abf90ee31cfba8e9988c"}
Apr 17 11:18:53.539475 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.539455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k9kpt" event={"ID":"7d8c1d69-d085-43d0-8ee2-384f6a278430","Type":"ContainerStarted","Data":"0ad82779c998990572a32e9947ca2502e3a15cab4c4ac0d63f85dd97a4b258b3"}
Apr 17 11:18:53.539555 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:53.539482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k9kpt" event={"ID":"7d8c1d69-d085-43d0-8ee2-384f6a278430","Type":"ContainerStarted","Data":"6077387bf930caa26e60ffa159b7866bd5088f3f548148d0de09aa69420cdbeb"}
Apr 17 11:18:54.543803 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:54.543766 2577 generic.go:358] "Generic (PLEG): container finished" podID="7d8c1d69-d085-43d0-8ee2-384f6a278430" containerID="0ad82779c998990572a32e9947ca2502e3a15cab4c4ac0d63f85dd97a4b258b3" exitCode=0
Apr 17 11:18:54.544247 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:54.543843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k9kpt" event={"ID":"7d8c1d69-d085-43d0-8ee2-384f6a278430","Type":"ContainerDied","Data":"0ad82779c998990572a32e9947ca2502e3a15cab4c4ac0d63f85dd97a4b258b3"}
Apr 17 11:18:55.548823 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:55.548786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-crscn" event={"ID":"142cbae1-73ac-4077-9d7f-b3393da4de44","Type":"ContainerStarted","Data":"2ccfdfcb53844dd679c8f062f51bf7af32e78442f660db0676fbba152e37a699"}
Apr 17 11:18:55.551857 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:55.551825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kxgxj" event={"ID":"9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb","Type":"ContainerStarted","Data":"90dab0306a796352bfde59d2dba2961676d582f4b79b43fafe98583cce40a763"}
Apr 17 11:18:55.557679 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:55.556880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k9kpt" event={"ID":"7d8c1d69-d085-43d0-8ee2-384f6a278430","Type":"ContainerStarted","Data":"000cba59b7f9d69d738bec6eb5a649a2642ebdea7678e66c26f2cc4392668477"}
Apr 17 11:18:55.557679 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:55.556911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k9kpt" event={"ID":"7d8c1d69-d085-43d0-8ee2-384f6a278430","Type":"ContainerStarted","Data":"3e73312fb6b77aa75ab9192274e231452b313381b57c508540593abcfad9ca9a"}
Apr 17 11:18:55.569636 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:55.568749 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kxgxj" podStartSLOduration=128.703471745 podStartE2EDuration="2m10.568731277s" podCreationTimestamp="2026-04-17 11:16:45 +0000 UTC" firstStartedPulling="2026-04-17 11:18:53.508348791 +0000 UTC m=+162.046271001" lastFinishedPulling="2026-04-17 11:18:55.373608324 +0000 UTC m=+163.911530533" observedRunningTime="2026-04-17 11:18:55.567633081 +0000 UTC m=+164.105555315" watchObservedRunningTime="2026-04-17 11:18:55.568731277 +0000 UTC m=+164.106653557"
Apr 17 11:18:55.589470 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:55.589409 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k9kpt" podStartSLOduration=3.775331826 podStartE2EDuration="4.589390869s" podCreationTimestamp="2026-04-17 11:18:51 +0000 UTC" firstStartedPulling="2026-04-17 11:18:52.588769659 +0000 UTC m=+161.126691868" lastFinishedPulling="2026-04-17 11:18:53.402828695 +0000 UTC m=+161.940750911" observedRunningTime="2026-04-17 11:18:55.587819083 +0000 UTC m=+164.125741317" watchObservedRunningTime="2026-04-17 11:18:55.589390869 +0000 UTC m=+164.127313100"
Apr 17 11:18:56.561951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:56.561917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-crscn" event={"ID":"142cbae1-73ac-4077-9d7f-b3393da4de44","Type":"ContainerStarted","Data":"f48caefcff16a4f439fd5b92d597215b5d257ebef3b025c648a7e0bf740f4c72"}
Apr 17 11:18:56.580637 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:56.580595 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-crscn" podStartSLOduration=129.704624807 podStartE2EDuration="2m11.580582069s" podCreationTimestamp="2026-04-17 11:16:45 +0000 UTC" firstStartedPulling="2026-04-17 11:18:53.493984348 +0000 UTC m=+162.031906558" lastFinishedPulling="2026-04-17 11:18:55.369941595 +0000 UTC m=+163.907863820" observedRunningTime="2026-04-17 11:18:56.579941471 +0000 UTC m=+165.117863703" watchObservedRunningTime="2026-04-17 11:18:56.580582069 +0000 UTC m=+165.118504300"
Apr 17 11:18:57.565565 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:18:57.565532 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-crscn"
Apr 17 11:19:01.578549 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:01.578513 2577 generic.go:358] "Generic (PLEG): container finished" podID="4f7a314a-b849-4a84-a9ba-c8fd75094d28" containerID="923cd1d0129ebc7cdc08eb63f35c7afc072faa44603aff8286c22815b3008415" exitCode=255
Apr 17 11:19:01.579011 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:01.578576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" event={"ID":"4f7a314a-b849-4a84-a9ba-c8fd75094d28","Type":"ContainerDied","Data":"923cd1d0129ebc7cdc08eb63f35c7afc072faa44603aff8286c22815b3008415"}
Apr 17 11:19:01.584118 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:01.584099 2577 scope.go:117] "RemoveContainer" containerID="923cd1d0129ebc7cdc08eb63f35c7afc072faa44603aff8286c22815b3008415"
Apr 17 11:19:02.582601 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:02.582561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65494997f7-8zfdv" event={"ID":"4f7a314a-b849-4a84-a9ba-c8fd75094d28","Type":"ContainerStarted","Data":"8fa2f4054e8e71ea111bf6425b24299a94bf464fb9bd74058df76dca0cf920e1"}
Apr 17 11:19:03.643839 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:03.643803 2577 patch_prober.go:28] interesting pod/image-registry-b7c457569-wmbwt container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 11:19:03.644205 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:03.643859 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" podUID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 11:19:04.009846 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:04.009815 2577 patch_prober.go:28] interesting pod/image-registry-9d9f47c87-m5vrb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 11:19:04.010015 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:04.009866 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb" podUID="76638c14-f3f9-4a5a-88fc-a5cd6da627ed" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 11:19:04.061322 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:04.061286 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:19:05.516015 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:05.515989 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9d9f47c87-m5vrb"
Apr 17 11:19:07.571058 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:07.571030 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-crscn"
Apr 17 11:19:08.657292 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:08.657234 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" podUID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" containerName="registry" containerID="cri-o://e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253" gracePeriod=30
Apr 17 11:19:08.890318 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:08.890295 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:19:09.030735 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030698 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-installation-pull-secrets\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.030913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030744 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.030913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030772 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-image-registry-private-configuration\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.030913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030813 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4d2\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-kube-api-access-nz4d2\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.030913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030840 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-ca-trust-extracted\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.030913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030870 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-trusted-ca\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.030913 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030902 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-certificates\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.031183 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.030942 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-bound-sa-token\") pod \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\" (UID: \"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84\") "
Apr 17 11:19:09.031562 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.031471 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:09.031562 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.031512 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:09.031922 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.031654 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-trusted-ca\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.031922 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.031675 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-certificates\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.033533 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.033502 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:09.033649 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.033568 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:09.033649 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.033633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:09.033739 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.033642 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:09.033739 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.033692 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-kube-api-access-nz4d2" (OuterVolumeSpecName: "kube-api-access-nz4d2") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "kube-api-access-nz4d2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:09.039687 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.039662 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" (UID: "abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:19:09.132038 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.131996 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nz4d2\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-kube-api-access-nz4d2\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.132038 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.132035 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-ca-trust-extracted\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.132038 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.132047 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-bound-sa-token\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.132038 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.132056 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-installation-pull-secrets\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.132278 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.132064 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-registry-tls\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.132278 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.132074 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84-image-registry-private-configuration\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:19:09.600854 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.600814 2577 generic.go:358] "Generic (PLEG): container finished" podID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" containerID="e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253" exitCode=0
Apr 17 11:19:09.601024 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.600903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" event={"ID":"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84","Type":"ContainerDied","Data":"e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253"}
Apr 17 11:19:09.601024 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.600937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b7c457569-wmbwt" event={"ID":"abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84","Type":"ContainerDied","Data":"1d144e47c2bfa228296bb453ea7b048dc9603b58f81da36f94cfe533310894d5"}
Apr 17 11:19:09.601024 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.600953 2577 scope.go:117] "RemoveContainer" containerID="e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253"
Apr 17 11:19:09.601024 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.600905 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b7c457569-wmbwt"
Apr 17 11:19:09.609260 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.609237 2577 scope.go:117] "RemoveContainer" containerID="e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253"
Apr 17 11:19:09.609564 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:19:09.609544 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253\": container with ID starting with e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253 not found: ID does not exist" containerID="e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253"
Apr 17 11:19:09.609619 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.609573 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253"} err="failed to get container status \"e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253\": rpc error: code = NotFound desc = could not find container \"e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253\": container with ID starting with e248a7598b549a2bd1616ca51e44bd6fba85695057f8fdbefdd83ccf0e579253 not found: ID does not exist"
Apr 17 11:19:09.620477 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.620452 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b7c457569-wmbwt"]
Apr 17 11:19:09.623974 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:09.623953 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b7c457569-wmbwt"]
Apr 17 11:19:10.065005 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:10.064978 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" path="/var/lib/kubelet/pods/abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84/volumes"
Apr 17 11:19:12.552359 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.552318 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c6d6f7485-8c7ld"]
Apr 17 11:19:12.552909 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.552889 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" containerName="registry"
Apr 17 11:19:12.552960 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.552913 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" containerName="registry"
Apr 17 11:19:12.553031 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.553019 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="abb2cb8a-baca-4bf5-a65c-20d8f7e8ad84" containerName="registry"
Apr 17 11:19:12.558615 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.558587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.561882 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.561855 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 11:19:12.561986 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.561907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 11:19:12.562045 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.562030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 11:19:12.562179 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.562162 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 11:19:12.562247 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.562213 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v5mpt\""
Apr 17 11:19:12.562296 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.562269 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 11:19:12.562296 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.562279 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 11:19:12.562429 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.562276 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 11:19:12.569863 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.569841 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c6d6f7485-8c7ld"]
Apr 17 11:19:12.659190 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.659154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-service-ca\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.659190 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.659194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-serving-cert\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.659500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.659212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-oauth-config\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.659500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.659233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-config\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.659500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.659390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-oauth-serving-cert\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.659500 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.659443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmpx\" (UniqueName: \"kubernetes.io/projected/7bfca5d7-caff-4674-be9f-b3f852f634d4-kube-api-access-nsmpx\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760192 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-service-ca\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-serving-cert\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-oauth-config\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-config\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-oauth-serving-cert\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmpx\" (UniqueName: \"kubernetes.io/projected/7bfca5d7-caff-4674-be9f-b3f852f634d4-kube-api-access-nsmpx\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.760944 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-service-ca\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.761038 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.760965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-config\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.761256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.761234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-oauth-serving-cert\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.762850 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.762829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-serving-cert\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.762942 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.762911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-oauth-config\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.769638 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.769614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmpx\" (UniqueName: \"kubernetes.io/projected/7bfca5d7-caff-4674-be9f-b3f852f634d4-kube-api-access-nsmpx\") pod \"console-7c6d6f7485-8c7ld\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.867849 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.867767 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c6d6f7485-8c7ld"
Apr 17 11:19:12.987348 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:12.987315 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c6d6f7485-8c7ld"]
Apr 17 11:19:12.990540 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:19:12.990514 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfca5d7_caff_4674_be9f_b3f852f634d4.slice/crio-df5b882ab0e13bc789be7ba9ec7dff9ac543cd5f3445f67fca80ca0236d6391a WatchSource:0}: Error finding container df5b882ab0e13bc789be7ba9ec7dff9ac543cd5f3445f67fca80ca0236d6391a: Status 404 returned error can't find the container with id df5b882ab0e13bc789be7ba9ec7dff9ac543cd5f3445f67fca80ca0236d6391a
Apr 17 11:19:13.612803 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:13.612769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6d6f7485-8c7ld" event={"ID":"7bfca5d7-caff-4674-be9f-b3f852f634d4","Type":"ContainerStarted","Data":"df5b882ab0e13bc789be7ba9ec7dff9ac543cd5f3445f67fca80ca0236d6391a"}
Apr 17 11:19:16.621707 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:16.621661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6d6f7485-8c7ld" event={"ID":"7bfca5d7-caff-4674-be9f-b3f852f634d4","Type":"ContainerStarted","Data":"95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388"}
Apr 17 11:19:17.145448 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.145394 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c6d6f7485-8c7ld" podStartSLOduration=2.326161597 podStartE2EDuration="5.145350487s" podCreationTimestamp="2026-04-17 11:19:12 +0000 UTC" firstStartedPulling="2026-04-17 11:19:12.992469623 +0000 UTC m=+181.530391833" lastFinishedPulling="2026-04-17 11:19:15.8116585 +0000 UTC m=+184.349580723" observedRunningTime="2026-04-17 11:19:16.641954179 +0000 UTC m=+185.179876411" watchObservedRunningTime="2026-04-17 11:19:17.145350487 +0000 UTC m=+185.683272719"
Apr 17 11:19:17.146494 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.146474 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6745f654fc-d4pv8"]
Apr 17 11:19:17.152002 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.151986 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.165600 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.165576 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 11:19:17.170498 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.170471 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6745f654fc-d4pv8"]
Apr 17 11:19:17.296741 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-serving-cert\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.296741 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-oauth-config\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.296955 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-oauth-serving-cert\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.296955 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-service-ca\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.296955 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-console-config\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.296955 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dmf\" (UniqueName: \"kubernetes.io/projected/7aa74da3-7390-4843-a463-28d8e63e460a-kube-api-access-27dmf\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:19:17.296955 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.296936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-trusted-ca-bundle\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17
11:19:17.398329 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-serving-cert\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.398329 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-oauth-config\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.398329 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-oauth-serving-cert\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.398634 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-service-ca\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.398634 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-console-config\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") 
" pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.398634 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27dmf\" (UniqueName: \"kubernetes.io/projected/7aa74da3-7390-4843-a463-28d8e63e460a-kube-api-access-27dmf\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.398634 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.398464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-trusted-ca-bundle\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.399108 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.399071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-service-ca\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.399241 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.399218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-oauth-serving-cert\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.399310 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.399295 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-trusted-ca-bundle\") pod \"console-6745f654fc-d4pv8\" (UID: 
\"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.399405 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.399357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-console-config\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.400926 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.400903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-oauth-config\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.401091 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.401073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-serving-cert\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.408020 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.407996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dmf\" (UniqueName: \"kubernetes.io/projected/7aa74da3-7390-4843-a463-28d8e63e460a-kube-api-access-27dmf\") pod \"console-6745f654fc-d4pv8\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") " pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.463551 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.463516 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:17.582470 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.582438 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6745f654fc-d4pv8"] Apr 17 11:19:17.585355 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:19:17.585319 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa74da3_7390_4843_a463_28d8e63e460a.slice/crio-e00034849b562bf529444c6df2c7d6b8b936a931f2c6d4b1a85e3d73c4afa902 WatchSource:0}: Error finding container e00034849b562bf529444c6df2c7d6b8b936a931f2c6d4b1a85e3d73c4afa902: Status 404 returned error can't find the container with id e00034849b562bf529444c6df2c7d6b8b936a931f2c6d4b1a85e3d73c4afa902 Apr 17 11:19:17.625378 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:17.625330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6745f654fc-d4pv8" event={"ID":"7aa74da3-7390-4843-a463-28d8e63e460a","Type":"ContainerStarted","Data":"e00034849b562bf529444c6df2c7d6b8b936a931f2c6d4b1a85e3d73c4afa902"} Apr 17 11:19:18.629903 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:18.629875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6745f654fc-d4pv8" event={"ID":"7aa74da3-7390-4843-a463-28d8e63e460a","Type":"ContainerStarted","Data":"7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2"} Apr 17 11:19:18.650731 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:18.650687 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6745f654fc-d4pv8" podStartSLOduration=1.650672933 podStartE2EDuration="1.650672933s" podCreationTimestamp="2026-04-17 11:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:18.649663015 +0000 UTC 
m=+187.187585268" watchObservedRunningTime="2026-04-17 11:19:18.650672933 +0000 UTC m=+187.188595165" Apr 17 11:19:22.868143 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:22.868099 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c6d6f7485-8c7ld" Apr 17 11:19:22.868630 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:22.868188 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c6d6f7485-8c7ld" Apr 17 11:19:22.873262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:22.873241 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c6d6f7485-8c7ld" Apr 17 11:19:23.648605 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:23.648578 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c6d6f7485-8c7ld" Apr 17 11:19:27.464115 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:27.464083 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:27.464707 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:27.464127 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:27.468698 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:27.468677 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:27.659083 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:27.659060 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6745f654fc-d4pv8" Apr 17 11:19:27.709271 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:27.709242 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c6d6f7485-8c7ld"] Apr 17 11:19:51.721841 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:19:51.721799 2577 generic.go:358] "Generic (PLEG): container finished" podID="9e8e8dee-ad5b-4371-84f8-4e123925013a" containerID="d5fbb67ea1a5bc848c3c561cc3eacd3c68ae9ac343aa1c39b8c0a1e13c442dec" exitCode=0 Apr 17 11:19:51.722454 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:51.722430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq" event={"ID":"9e8e8dee-ad5b-4371-84f8-4e123925013a","Type":"ContainerDied","Data":"d5fbb67ea1a5bc848c3c561cc3eacd3c68ae9ac343aa1c39b8c0a1e13c442dec"} Apr 17 11:19:51.723020 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:51.723001 2577 scope.go:117] "RemoveContainer" containerID="d5fbb67ea1a5bc848c3c561cc3eacd3c68ae9ac343aa1c39b8c0a1e13c442dec" Apr 17 11:19:51.726998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:51.726976 2577 generic.go:358] "Generic (PLEG): container finished" podID="241708e9-6c54-4758-aa09-fa52e406c967" containerID="6416f40b9510bc7f9c09933eb63d903be2d6098109bb52257fb180e788bd2292" exitCode=0 Apr 17 11:19:51.727065 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:51.727028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h" event={"ID":"241708e9-6c54-4758-aa09-fa52e406c967","Type":"ContainerDied","Data":"6416f40b9510bc7f9c09933eb63d903be2d6098109bb52257fb180e788bd2292"} Apr 17 11:19:51.727348 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:51.727332 2577 scope.go:117] "RemoveContainer" containerID="6416f40b9510bc7f9c09933eb63d903be2d6098109bb52257fb180e788bd2292" Apr 17 11:19:52.729023 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:52.728990 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c6d6f7485-8c7ld" podUID="7bfca5d7-caff-4674-be9f-b3f852f634d4" containerName="console" 
containerID="cri-o://95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388" gracePeriod=15 Apr 17 11:19:52.731522 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:52.731497 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-45hcq" event={"ID":"9e8e8dee-ad5b-4371-84f8-4e123925013a","Type":"ContainerStarted","Data":"8aef389dcde02e6bd0ebafb3452cd559bd21f782db0f32964ad7641037a89000"} Apr 17 11:19:52.733270 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:52.733248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6sx6h" event={"ID":"241708e9-6c54-4758-aa09-fa52e406c967","Type":"ContainerStarted","Data":"a8255fcdfcf2b5a6cc26ae2486e5f800e47f0f497f64108a728be1faa6a2ac1a"} Apr 17 11:19:52.976314 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:52.976292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c6d6f7485-8c7ld_7bfca5d7-caff-4674-be9f-b3f852f634d4/console/0.log" Apr 17 11:19:52.976466 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:52.976383 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c6d6f7485-8c7ld" Apr 17 11:19:53.083584 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.083480 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-oauth-config\") pod \"7bfca5d7-caff-4674-be9f-b3f852f634d4\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " Apr 17 11:19:53.083584 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.083533 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-serving-cert\") pod \"7bfca5d7-caff-4674-be9f-b3f852f634d4\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " Apr 17 11:19:53.083584 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.083570 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsmpx\" (UniqueName: \"kubernetes.io/projected/7bfca5d7-caff-4674-be9f-b3f852f634d4-kube-api-access-nsmpx\") pod \"7bfca5d7-caff-4674-be9f-b3f852f634d4\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " Apr 17 11:19:53.083863 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.083597 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-config\") pod \"7bfca5d7-caff-4674-be9f-b3f852f634d4\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " Apr 17 11:19:53.083863 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.083622 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-oauth-serving-cert\") pod \"7bfca5d7-caff-4674-be9f-b3f852f634d4\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " Apr 17 
11:19:53.083863 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.083649 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-service-ca\") pod \"7bfca5d7-caff-4674-be9f-b3f852f634d4\" (UID: \"7bfca5d7-caff-4674-be9f-b3f852f634d4\") " Apr 17 11:19:53.084062 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.084030 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-config" (OuterVolumeSpecName: "console-config") pod "7bfca5d7-caff-4674-be9f-b3f852f634d4" (UID: "7bfca5d7-caff-4674-be9f-b3f852f634d4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:53.084062 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.084043 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7bfca5d7-caff-4674-be9f-b3f852f634d4" (UID: "7bfca5d7-caff-4674-be9f-b3f852f634d4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:53.084165 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.084128 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "7bfca5d7-caff-4674-be9f-b3f852f634d4" (UID: "7bfca5d7-caff-4674-be9f-b3f852f634d4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:19:53.086095 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.086072 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7bfca5d7-caff-4674-be9f-b3f852f634d4" (UID: "7bfca5d7-caff-4674-be9f-b3f852f634d4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:53.086200 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.086106 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7bfca5d7-caff-4674-be9f-b3f852f634d4" (UID: "7bfca5d7-caff-4674-be9f-b3f852f634d4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:19:53.086200 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.086171 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfca5d7-caff-4674-be9f-b3f852f634d4-kube-api-access-nsmpx" (OuterVolumeSpecName: "kube-api-access-nsmpx") pod "7bfca5d7-caff-4674-be9f-b3f852f634d4" (UID: "7bfca5d7-caff-4674-be9f-b3f852f634d4"). InnerVolumeSpecName "kube-api-access-nsmpx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:19:53.185168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.185129 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:19:53.185168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.185160 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsmpx\" (UniqueName: \"kubernetes.io/projected/7bfca5d7-caff-4674-be9f-b3f852f634d4-kube-api-access-nsmpx\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:19:53.185168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.185170 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:19:53.185425 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.185180 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-oauth-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:19:53.185425 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.185189 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bfca5d7-caff-4674-be9f-b3f852f634d4-service-ca\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:19:53.185425 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.185197 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bfca5d7-caff-4674-be9f-b3f852f634d4-console-oauth-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:19:53.737060 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:19:53.737033 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c6d6f7485-8c7ld_7bfca5d7-caff-4674-be9f-b3f852f634d4/console/0.log" Apr 17 11:19:53.737480 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.737077 2577 generic.go:358] "Generic (PLEG): container finished" podID="7bfca5d7-caff-4674-be9f-b3f852f634d4" containerID="95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388" exitCode=2 Apr 17 11:19:53.737480 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.737110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6d6f7485-8c7ld" event={"ID":"7bfca5d7-caff-4674-be9f-b3f852f634d4","Type":"ContainerDied","Data":"95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388"} Apr 17 11:19:53.737480 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.737151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6d6f7485-8c7ld" event={"ID":"7bfca5d7-caff-4674-be9f-b3f852f634d4","Type":"ContainerDied","Data":"df5b882ab0e13bc789be7ba9ec7dff9ac543cd5f3445f67fca80ca0236d6391a"} Apr 17 11:19:53.737480 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.737150 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c6d6f7485-8c7ld" Apr 17 11:19:53.737480 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.737163 2577 scope.go:117] "RemoveContainer" containerID="95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388" Apr 17 11:19:53.745142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.745127 2577 scope.go:117] "RemoveContainer" containerID="95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388" Apr 17 11:19:53.745388 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:19:53.745345 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388\": container with ID starting with 95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388 not found: ID does not exist" containerID="95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388" Apr 17 11:19:53.745447 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.745387 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388"} err="failed to get container status \"95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388\": rpc error: code = NotFound desc = could not find container \"95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388\": container with ID starting with 95bd26f72070b93c4df5247c4870a9d0a825ea782d181281fe4006b16c154388 not found: ID does not exist" Apr 17 11:19:53.756585 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.756558 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c6d6f7485-8c7ld"] Apr 17 11:19:53.760548 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:19:53.760530 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c6d6f7485-8c7ld"] Apr 17 11:19:54.068599 ip-10-0-128-205 kubenswrapper[2577]: 
I0417 11:19:54.066920 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfca5d7-caff-4674-be9f-b3f852f634d4" path="/var/lib/kubelet/pods/7bfca5d7-caff-4674-be9f-b3f852f634d4/volumes" Apr 17 11:20:19.370429 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.370394 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b87dffd84-gnzrg"] Apr 17 11:20:19.370838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.370654 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bfca5d7-caff-4674-be9f-b3f852f634d4" containerName="console" Apr 17 11:20:19.370838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.370665 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfca5d7-caff-4674-be9f-b3f852f634d4" containerName="console" Apr 17 11:20:19.370838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.370712 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bfca5d7-caff-4674-be9f-b3f852f634d4" containerName="console" Apr 17 11:20:19.374900 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.374883 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.397269 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.397245 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b87dffd84-gnzrg"]
Apr 17 11:20:19.484182 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-oauth-serving-cert\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.484341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-service-ca\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.484341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-serving-cert\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.484341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-oauth-config\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.484481 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-trusted-ca-bundle\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.484481 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j2f4\" (UniqueName: \"kubernetes.io/projected/9f5bdbe0-58a2-4835-a617-68a7443f80a6-kube-api-access-8j2f4\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.484481 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.484426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-config\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585512 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-trusted-ca-bundle\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585512 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j2f4\" (UniqueName: \"kubernetes.io/projected/9f5bdbe0-58a2-4835-a617-68a7443f80a6-kube-api-access-8j2f4\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585772 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-config\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585772 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-oauth-serving-cert\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585772 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-service-ca\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585772 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-serving-cert\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.585772 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.585665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-oauth-config\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.586450 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.586425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-oauth-serving-cert\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.586578 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.586552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-service-ca\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.586622 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.586552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-config\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.586683 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.586660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-trusted-ca-bundle\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.588190 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.588163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-serving-cert\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.588289 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.588268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-oauth-config\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.595922 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.595890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j2f4\" (UniqueName: \"kubernetes.io/projected/9f5bdbe0-58a2-4835-a617-68a7443f80a6-kube-api-access-8j2f4\") pod \"console-6b87dffd84-gnzrg\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.684620 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.684532 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:19.806947 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:19.806922 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b87dffd84-gnzrg"]
Apr 17 11:20:19.809661 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:20:19.809635 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5bdbe0_58a2_4835_a617_68a7443f80a6.slice/crio-51da4d2e39eab37a77b46e70f0186eae61a62370c69971f514e0badc84b9476a WatchSource:0}: Error finding container 51da4d2e39eab37a77b46e70f0186eae61a62370c69971f514e0badc84b9476a: Status 404 returned error can't find the container with id 51da4d2e39eab37a77b46e70f0186eae61a62370c69971f514e0badc84b9476a
Apr 17 11:20:20.811858 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:20.811815 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b87dffd84-gnzrg" event={"ID":"9f5bdbe0-58a2-4835-a617-68a7443f80a6","Type":"ContainerStarted","Data":"7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714"}
Apr 17 11:20:20.811858 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:20.811851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b87dffd84-gnzrg" event={"ID":"9f5bdbe0-58a2-4835-a617-68a7443f80a6","Type":"ContainerStarted","Data":"51da4d2e39eab37a77b46e70f0186eae61a62370c69971f514e0badc84b9476a"}
Apr 17 11:20:20.830559 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:20.830516 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b87dffd84-gnzrg" podStartSLOduration=1.8305029830000001 podStartE2EDuration="1.830502983s" podCreationTimestamp="2026-04-17 11:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:20.829953893 +0000 UTC m=+249.367876126" watchObservedRunningTime="2026-04-17 11:20:20.830502983 +0000 UTC m=+249.368425215"
Apr 17 11:20:23.819644 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:23.819611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:20:23.821957 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:23.821939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ecbf8c24-6e0b-4d26-9530-6bcc59825ca0-metrics-certs\") pod \"network-metrics-daemon-dn4mx\" (UID: \"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0\") " pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:20:23.863978 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:23.863950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-g9l5h\""
Apr 17 11:20:23.871728 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:23.871689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dn4mx"
Apr 17 11:20:23.989622 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:23.989594 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dn4mx"]
Apr 17 11:20:23.992896 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:20:23.992872 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecbf8c24_6e0b_4d26_9530_6bcc59825ca0.slice/crio-6c005bdb70b9c4d472d28e9c37ffee907521405bdcc30ba07c58296118d1f3ba WatchSource:0}: Error finding container 6c005bdb70b9c4d472d28e9c37ffee907521405bdcc30ba07c58296118d1f3ba: Status 404 returned error can't find the container with id 6c005bdb70b9c4d472d28e9c37ffee907521405bdcc30ba07c58296118d1f3ba
Apr 17 11:20:24.822080 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:24.822046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dn4mx" event={"ID":"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0","Type":"ContainerStarted","Data":"6c005bdb70b9c4d472d28e9c37ffee907521405bdcc30ba07c58296118d1f3ba"}
Apr 17 11:20:25.828934 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:25.828898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dn4mx" event={"ID":"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0","Type":"ContainerStarted","Data":"4833b1948868f3b496c8c0a2a444d86287ad92aa98d2ff7f386340999f1f7a94"}
Apr 17 11:20:25.828934 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:25.828933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dn4mx" event={"ID":"ecbf8c24-6e0b-4d26-9530-6bcc59825ca0","Type":"ContainerStarted","Data":"ba82b2429936a95877132571aec3bf73eea92a1ec6f551f6b6bf472e0da501a9"}
Apr 17 11:20:25.848558 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:25.848509 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dn4mx" podStartSLOduration=252.975489287 podStartE2EDuration="4m13.848495437s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:20:23.994712686 +0000 UTC m=+252.532634911" lastFinishedPulling="2026-04-17 11:20:24.867718834 +0000 UTC m=+253.405641061" observedRunningTime="2026-04-17 11:20:25.847280274 +0000 UTC m=+254.385202506" watchObservedRunningTime="2026-04-17 11:20:25.848495437 +0000 UTC m=+254.386417663"
Apr 17 11:20:29.685539 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:29.685442 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:29.685539 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:29.685497 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:29.690238 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:29.690216 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:29.842094 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:29.842070 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b87dffd84-gnzrg"
Apr 17 11:20:29.894887 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:29.894853 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6745f654fc-d4pv8"]
Apr 17 11:20:54.920073 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:54.920009 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6745f654fc-d4pv8" podUID="7aa74da3-7390-4843-a463-28d8e63e460a" containerName="console" containerID="cri-o://7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2" gracePeriod=15
Apr 17 11:20:55.165065 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.165040 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6745f654fc-d4pv8_7aa74da3-7390-4843-a463-28d8e63e460a/console/0.log"
Apr 17 11:20:55.165174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.165100 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:20:55.358952 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.358897 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-service-ca\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.358952 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.358960 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-trusted-ca-bundle\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.359235 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.358990 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-oauth-serving-cert\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.359235 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359014 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-console-config\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.359235 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359036 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27dmf\" (UniqueName: \"kubernetes.io/projected/7aa74da3-7390-4843-a463-28d8e63e460a-kube-api-access-27dmf\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.359235 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359075 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-serving-cert\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.359235 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359134 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-oauth-config\") pod \"7aa74da3-7390-4843-a463-28d8e63e460a\" (UID: \"7aa74da3-7390-4843-a463-28d8e63e460a\") "
Apr 17 11:20:55.359509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359435 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-service-ca" (OuterVolumeSpecName: "service-ca") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:55.359509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359455 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:55.359509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359479 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-console-config" (OuterVolumeSpecName: "console-config") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:55.359663 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.359524 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:55.361432 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.361404 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:55.361531 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.361494 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:55.361531 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.361500 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa74da3-7390-4843-a463-28d8e63e460a-kube-api-access-27dmf" (OuterVolumeSpecName: "kube-api-access-27dmf") pod "7aa74da3-7390-4843-a463-28d8e63e460a" (UID: "7aa74da3-7390-4843-a463-28d8e63e460a"). InnerVolumeSpecName "kube-api-access-27dmf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:20:55.460302 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460270 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-service-ca\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.460302 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460297 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-trusted-ca-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.460302 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460308 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-oauth-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.460540 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460317 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7aa74da3-7390-4843-a463-28d8e63e460a-console-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.460540 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460326 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27dmf\" (UniqueName: \"kubernetes.io/projected/7aa74da3-7390-4843-a463-28d8e63e460a-kube-api-access-27dmf\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.460540 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460335 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.460540 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.460344 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7aa74da3-7390-4843-a463-28d8e63e460a-console-oauth-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:20:55.909506 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.909480 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6745f654fc-d4pv8_7aa74da3-7390-4843-a463-28d8e63e460a/console/0.log"
Apr 17 11:20:55.909705 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.909519 2577 generic.go:358] "Generic (PLEG): container finished" podID="7aa74da3-7390-4843-a463-28d8e63e460a" containerID="7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2" exitCode=2
Apr 17 11:20:55.909705 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.909587 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6745f654fc-d4pv8" event={"ID":"7aa74da3-7390-4843-a463-28d8e63e460a","Type":"ContainerDied","Data":"7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2"}
Apr 17 11:20:55.909705 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.909601 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6745f654fc-d4pv8"
Apr 17 11:20:55.909705 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.909619 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6745f654fc-d4pv8" event={"ID":"7aa74da3-7390-4843-a463-28d8e63e460a","Type":"ContainerDied","Data":"e00034849b562bf529444c6df2c7d6b8b936a931f2c6d4b1a85e3d73c4afa902"}
Apr 17 11:20:55.909705 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.909635 2577 scope.go:117] "RemoveContainer" containerID="7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2"
Apr 17 11:20:55.920165 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.920143 2577 scope.go:117] "RemoveContainer" containerID="7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2"
Apr 17 11:20:55.920546 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:20:55.920524 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2\": container with ID starting with 7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2 not found: ID does not exist" containerID="7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2"
Apr 17 11:20:55.920605 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.920553 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2"} err="failed to get container status \"7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2\": rpc error: code = NotFound desc = could not find container \"7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2\": container with ID starting with 7456cde64c830e9d7d4fe6cb968dd5d01e565bd0022dc91ac93f56c50017fcf2 not found: ID does not exist"
Apr 17 11:20:55.934006 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.933976 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6745f654fc-d4pv8"]
Apr 17 11:20:55.939270 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:55.939246 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6745f654fc-d4pv8"]
Apr 17 11:20:56.065257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:20:56.065224 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa74da3-7390-4843-a463-28d8e63e460a" path="/var/lib/kubelet/pods/7aa74da3-7390-4843-a463-28d8e63e460a/volumes"
Apr 17 11:21:11.915951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:11.915909 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log"
Apr 17 11:21:11.920466 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:11.917575 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log"
Apr 17 11:21:11.926333 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:11.926253 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 11:21:26.856492 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.856455 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-594f5485b8-z4tx4"]
Apr 17 11:21:26.858228 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.856737 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7aa74da3-7390-4843-a463-28d8e63e460a" containerName="console"
Apr 17 11:21:26.858228 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.856749 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa74da3-7390-4843-a463-28d8e63e460a" containerName="console"
Apr 17 11:21:26.858228 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.856796 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7aa74da3-7390-4843-a463-28d8e63e460a" containerName="console"
Apr 17 11:21:26.858798 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.858780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.872934 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.872904 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-594f5485b8-z4tx4"]
Apr 17 11:21:26.878527 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-serving-cert\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.878665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-oauth-config\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.878665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-trusted-ca-bundle\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.878665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgn55\" (UniqueName: \"kubernetes.io/projected/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-kube-api-access-pgn55\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.878828 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-service-ca\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.878828 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-oauth-serving-cert\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.878828 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.878803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-config\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.979676 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-serving-cert\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.979676 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-oauth-config\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.979874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-trusted-ca-bundle\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.979874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgn55\" (UniqueName: \"kubernetes.io/projected/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-kube-api-access-pgn55\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.979942 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-service-ca\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.979942 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-oauth-serving-cert\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.980019 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.979971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-config\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.980688 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.980654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-service-ca\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.980688 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.980677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-trusted-ca-bundle\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.980885 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.980832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-oauth-serving-cert\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.980969 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.980947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-config\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.982285 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.982261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-serving-cert\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.982394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.982299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-oauth-config\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:26.989475 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:26.989457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgn55\" (UniqueName: \"kubernetes.io/projected/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-kube-api-access-pgn55\") pod \"console-594f5485b8-z4tx4\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:27.168001 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:27.167917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-594f5485b8-z4tx4"
Apr 17 11:21:27.296609 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:27.296579 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-594f5485b8-z4tx4"]
Apr 17 11:21:27.300079 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:21:27.300053 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd97aa1_6d9e_4f68_ad38_7aea6147d061.slice/crio-14b9394be3ea16e3e3afa60ba91c661ff3ddbfd84489fd84383c5a7884303488 WatchSource:0}: Error finding container 14b9394be3ea16e3e3afa60ba91c661ff3ddbfd84489fd84383c5a7884303488: Status 404 returned error can't find the container with id 14b9394be3ea16e3e3afa60ba91c661ff3ddbfd84489fd84383c5a7884303488
Apr 17 11:21:27.301755 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:27.301739 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:21:28.004346 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:28.004314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-594f5485b8-z4tx4" event={"ID":"3dd97aa1-6d9e-4f68-ad38-7aea6147d061","Type":"ContainerStarted","Data":"b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d"}
Apr 17 11:21:28.004346 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:28.004348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-594f5485b8-z4tx4" event={"ID":"3dd97aa1-6d9e-4f68-ad38-7aea6147d061","Type":"ContainerStarted","Data":"14b9394be3ea16e3e3afa60ba91c661ff3ddbfd84489fd84383c5a7884303488"}
Apr 17 11:21:28.021283 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:28.021235 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-594f5485b8-z4tx4" podStartSLOduration=2.021219665 podStartE2EDuration="2.021219665s" podCreationTimestamp="2026-04-17 11:21:26 +0000
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:21:28.020153581 +0000 UTC m=+316.558075825" watchObservedRunningTime="2026-04-17 11:21:28.021219665 +0000 UTC m=+316.559141900" Apr 17 11:21:37.168832 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:37.168792 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-594f5485b8-z4tx4" Apr 17 11:21:37.168832 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:37.168838 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-594f5485b8-z4tx4" Apr 17 11:21:37.173616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:37.173592 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-594f5485b8-z4tx4" Apr 17 11:21:38.039021 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:38.038992 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-594f5485b8-z4tx4" Apr 17 11:21:38.083286 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:21:38.083251 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b87dffd84-gnzrg"] Apr 17 11:22:03.110008 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.109906 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b87dffd84-gnzrg" podUID="9f5bdbe0-58a2-4835-a617-68a7443f80a6" containerName="console" containerID="cri-o://7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714" gracePeriod=15 Apr 17 11:22:03.185430 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.185393 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-87nfm"] Apr 17 11:22:03.188249 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.188226 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.190607 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.190585 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:22:03.196779 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.196752 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-87nfm"] Apr 17 11:22:03.342390 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.342343 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b87dffd84-gnzrg_9f5bdbe0-58a2-4835-a617-68a7443f80a6/console/0.log" Apr 17 11:22:03.342503 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.342423 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b87dffd84-gnzrg" Apr 17 11:22:03.364284 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.364201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3715a6cc-4533-4b7c-b268-85902d84afd1-kubelet-config\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.364284 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.364231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3715a6cc-4533-4b7c-b268-85902d84afd1-original-pull-secret\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.364284 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.364269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" 
(UniqueName: \"kubernetes.io/host-path/3715a6cc-4533-4b7c-b268-85902d84afd1-dbus\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.465191 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465146 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j2f4\" (UniqueName: \"kubernetes.io/projected/9f5bdbe0-58a2-4835-a617-68a7443f80a6-kube-api-access-8j2f4\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465191 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465203 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-trusted-ca-bundle\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465483 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465222 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-service-ca\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465483 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465239 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-oauth-serving-cert\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465483 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465266 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-oauth-config\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465483 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465429 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-serving-cert\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465674 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465522 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-config\") pod \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\" (UID: \"9f5bdbe0-58a2-4835-a617-68a7443f80a6\") " Apr 17 11:22:03.465741 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3715a6cc-4533-4b7c-b268-85902d84afd1-kubelet-config\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.465741 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465716 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:22:03.465741 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465726 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:22:03.465879 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465733 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:22:03.465879 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3715a6cc-4533-4b7c-b268-85902d84afd1-original-pull-secret\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.465879 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3715a6cc-4533-4b7c-b268-85902d84afd1-kubelet-config\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.466010 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/3715a6cc-4533-4b7c-b268-85902d84afd1-dbus\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.466010 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465981 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-service-ca\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.466010 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.465998 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-oauth-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.466152 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.466015 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-trusted-ca-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.466152 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.466057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3715a6cc-4533-4b7c-b268-85902d84afd1-dbus\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.466152 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.466071 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-config" (OuterVolumeSpecName: "console-config") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:22:03.467872 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.467843 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5bdbe0-58a2-4835-a617-68a7443f80a6-kube-api-access-8j2f4" (OuterVolumeSpecName: "kube-api-access-8j2f4") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "kube-api-access-8j2f4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:22:03.468121 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.468095 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:22:03.468121 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.468112 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f5bdbe0-58a2-4835-a617-68a7443f80a6" (UID: "9f5bdbe0-58a2-4835-a617-68a7443f80a6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:22:03.468279 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.468105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3715a6cc-4533-4b7c-b268-85902d84afd1-original-pull-secret\") pod \"global-pull-secret-syncer-87nfm\" (UID: \"3715a6cc-4533-4b7c-b268-85902d84afd1\") " pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.497393 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.497347 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87nfm" Apr 17 11:22:03.567093 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.567065 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8j2f4\" (UniqueName: \"kubernetes.io/projected/9f5bdbe0-58a2-4835-a617-68a7443f80a6-kube-api-access-8j2f4\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.567093 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.567093 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-oauth-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.567346 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.567103 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.567346 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:03.567112 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f5bdbe0-58a2-4835-a617-68a7443f80a6-console-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:03.639398 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:22:03.639352 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-87nfm"] Apr 17 11:22:03.644538 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:22:03.644511 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3715a6cc_4533_4b7c_b268_85902d84afd1.slice/crio-7a1b61fae28720bdc69bc79a194a17515163617b9f4217c9f08fc1bf9a58778e WatchSource:0}: Error finding container 7a1b61fae28720bdc69bc79a194a17515163617b9f4217c9f08fc1bf9a58778e: Status 404 returned error can't find the container with id 7a1b61fae28720bdc69bc79a194a17515163617b9f4217c9f08fc1bf9a58778e Apr 17 11:22:04.103351 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.103307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-87nfm" event={"ID":"3715a6cc-4533-4b7c-b268-85902d84afd1","Type":"ContainerStarted","Data":"7a1b61fae28720bdc69bc79a194a17515163617b9f4217c9f08fc1bf9a58778e"} Apr 17 11:22:04.104493 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.104475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b87dffd84-gnzrg_9f5bdbe0-58a2-4835-a617-68a7443f80a6/console/0.log" Apr 17 11:22:04.104600 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.104508 2577 generic.go:358] "Generic (PLEG): container finished" podID="9f5bdbe0-58a2-4835-a617-68a7443f80a6" containerID="7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714" exitCode=2 Apr 17 11:22:04.104600 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.104534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b87dffd84-gnzrg" event={"ID":"9f5bdbe0-58a2-4835-a617-68a7443f80a6","Type":"ContainerDied","Data":"7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714"} Apr 17 11:22:04.104600 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.104558 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b87dffd84-gnzrg" event={"ID":"9f5bdbe0-58a2-4835-a617-68a7443f80a6","Type":"ContainerDied","Data":"51da4d2e39eab37a77b46e70f0186eae61a62370c69971f514e0badc84b9476a"} Apr 17 11:22:04.104600 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.104574 2577 scope.go:117] "RemoveContainer" containerID="7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714" Apr 17 11:22:04.104744 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.104603 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b87dffd84-gnzrg" Apr 17 11:22:04.112414 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.112202 2577 scope.go:117] "RemoveContainer" containerID="7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714" Apr 17 11:22:04.112774 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:04.112485 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714\": container with ID starting with 7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714 not found: ID does not exist" containerID="7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714" Apr 17 11:22:04.112774 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.112517 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714"} err="failed to get container status \"7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714\": rpc error: code = NotFound desc = could not find container \"7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714\": container with ID starting with 7e8674c7849083555b884b2fb56d751b3451938ca9197af151f599840d4f7714 not found: ID does not exist" Apr 17 11:22:04.120597 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:22:04.120560 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b87dffd84-gnzrg"] Apr 17 11:22:04.124615 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:04.124587 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b87dffd84-gnzrg"] Apr 17 11:22:06.065586 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:06.065551 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5bdbe0-58a2-4835-a617-68a7443f80a6" path="/var/lib/kubelet/pods/9f5bdbe0-58a2-4835-a617-68a7443f80a6/volumes" Apr 17 11:22:08.117592 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:08.117555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-87nfm" event={"ID":"3715a6cc-4533-4b7c-b268-85902d84afd1","Type":"ContainerStarted","Data":"1867b39bae3cf990c107bc2c2dbc74cb43b792de56ad1eb362794c65d6b40c35"} Apr 17 11:22:08.132551 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:08.132466 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-87nfm" podStartSLOduration=1.5490339130000002 podStartE2EDuration="5.132447583s" podCreationTimestamp="2026-04-17 11:22:03 +0000 UTC" firstStartedPulling="2026-04-17 11:22:03.645855184 +0000 UTC m=+352.183777394" lastFinishedPulling="2026-04-17 11:22:07.229268837 +0000 UTC m=+355.767191064" observedRunningTime="2026-04-17 11:22:08.131219071 +0000 UTC m=+356.669141302" watchObservedRunningTime="2026-04-17 11:22:08.132447583 +0000 UTC m=+356.670369817" Apr 17 11:22:22.439850 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.439809 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg"] Apr 17 11:22:22.440232 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.440109 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9f5bdbe0-58a2-4835-a617-68a7443f80a6" containerName="console" Apr 17 11:22:22.440232 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.440122 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5bdbe0-58a2-4835-a617-68a7443f80a6" containerName="console" Apr 17 11:22:22.440232 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.440176 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f5bdbe0-58a2-4835-a617-68a7443f80a6" containerName="console" Apr 17 11:22:22.444584 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.444564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.446681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.446661 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8xcdc\"" Apr 17 11:22:22.446794 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.446660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 11:22:22.447016 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.447002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 11:22:22.451119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.451098 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg"] Apr 17 11:22:22.504914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.504875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: 
\"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.504914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.504918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zv9\" (UniqueName: \"kubernetes.io/projected/162f9679-9d0c-4190-ae1d-2f31a45a6944-kube-api-access-p6zv9\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.505106 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.505017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.605676 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.605642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.605844 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.605695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.605844 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.605717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zv9\" (UniqueName: \"kubernetes.io/projected/162f9679-9d0c-4190-ae1d-2f31a45a6944-kube-api-access-p6zv9\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.606013 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.605993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.606045 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.606029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.613786 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.613763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zv9\" (UniqueName: \"kubernetes.io/projected/162f9679-9d0c-4190-ae1d-2f31a45a6944-kube-api-access-p6zv9\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.753942 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.753891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:22.873919 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:22.873887 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg"] Apr 17 11:22:22.877208 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:22:22.877183 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod162f9679_9d0c_4190_ae1d_2f31a45a6944.slice/crio-563a74a92df262fa077982ef40acbbdf18444f4d6129aabdc8ee6f669c35206c WatchSource:0}: Error finding container 563a74a92df262fa077982ef40acbbdf18444f4d6129aabdc8ee6f669c35206c: Status 404 returned error can't find the container with id 563a74a92df262fa077982ef40acbbdf18444f4d6129aabdc8ee6f669c35206c Apr 17 11:22:23.159879 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:23.159801 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" event={"ID":"162f9679-9d0c-4190-ae1d-2f31a45a6944","Type":"ContainerStarted","Data":"563a74a92df262fa077982ef40acbbdf18444f4d6129aabdc8ee6f669c35206c"} Apr 17 11:22:28.174615 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:28.174521 2577 generic.go:358] "Generic (PLEG): container finished" podID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerID="75f531b19007cb9891cd6ece33192e9154cb25b1ca9cce8a1384e9b2086e90c6" exitCode=0 Apr 17 11:22:28.174615 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:28.174600 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" event={"ID":"162f9679-9d0c-4190-ae1d-2f31a45a6944","Type":"ContainerDied","Data":"75f531b19007cb9891cd6ece33192e9154cb25b1ca9cce8a1384e9b2086e90c6"} Apr 17 11:22:30.180880 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:30.180853 2577 generic.go:358] "Generic (PLEG): container finished" podID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerID="a65a2f8d1cb79e9979e63bd65cb98698f4e67d46e24ba8f6a7fa3fbe55ab58cc" exitCode=0 Apr 17 11:22:30.181242 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:30.180931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" event={"ID":"162f9679-9d0c-4190-ae1d-2f31a45a6944","Type":"ContainerDied","Data":"a65a2f8d1cb79e9979e63bd65cb98698f4e67d46e24ba8f6a7fa3fbe55ab58cc"} Apr 17 11:22:37.204781 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:37.204749 2577 generic.go:358] "Generic (PLEG): container finished" podID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerID="869d33dec37267f5b7b15f221523015d9e6fbcc3e800a4a684c9a26d77bf8362" exitCode=0 Apr 17 11:22:37.205178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:37.204821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" event={"ID":"162f9679-9d0c-4190-ae1d-2f31a45a6944","Type":"ContainerDied","Data":"869d33dec37267f5b7b15f221523015d9e6fbcc3e800a4a684c9a26d77bf8362"} Apr 17 11:22:38.325794 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.325770 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:38.424494 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.424453 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6zv9\" (UniqueName: \"kubernetes.io/projected/162f9679-9d0c-4190-ae1d-2f31a45a6944-kube-api-access-p6zv9\") pod \"162f9679-9d0c-4190-ae1d-2f31a45a6944\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " Apr 17 11:22:38.424682 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.424536 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-util\") pod \"162f9679-9d0c-4190-ae1d-2f31a45a6944\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " Apr 17 11:22:38.424682 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.424607 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-bundle\") pod \"162f9679-9d0c-4190-ae1d-2f31a45a6944\" (UID: \"162f9679-9d0c-4190-ae1d-2f31a45a6944\") " Apr 17 11:22:38.425176 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.425139 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-bundle" (OuterVolumeSpecName: "bundle") pod "162f9679-9d0c-4190-ae1d-2f31a45a6944" (UID: "162f9679-9d0c-4190-ae1d-2f31a45a6944"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:22:38.426924 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.426905 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162f9679-9d0c-4190-ae1d-2f31a45a6944-kube-api-access-p6zv9" (OuterVolumeSpecName: "kube-api-access-p6zv9") pod "162f9679-9d0c-4190-ae1d-2f31a45a6944" (UID: "162f9679-9d0c-4190-ae1d-2f31a45a6944"). InnerVolumeSpecName "kube-api-access-p6zv9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:22:38.428416 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.428395 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-util" (OuterVolumeSpecName: "util") pod "162f9679-9d0c-4190-ae1d-2f31a45a6944" (UID: "162f9679-9d0c-4190-ae1d-2f31a45a6944"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:22:38.525947 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.525851 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:38.525947 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.525885 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/162f9679-9d0c-4190-ae1d-2f31a45a6944-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:38.525947 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:38.525898 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6zv9\" (UniqueName: \"kubernetes.io/projected/162f9679-9d0c-4190-ae1d-2f31a45a6944-kube-api-access-p6zv9\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:22:39.212573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:39.212533 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" event={"ID":"162f9679-9d0c-4190-ae1d-2f31a45a6944","Type":"ContainerDied","Data":"563a74a92df262fa077982ef40acbbdf18444f4d6129aabdc8ee6f669c35206c"} Apr 17 11:22:39.212573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:39.212569 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563a74a92df262fa077982ef40acbbdf18444f4d6129aabdc8ee6f669c35206c" Apr 17 11:22:39.212777 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:39.212592 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqmsbg" Apr 17 11:22:49.680841 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.680804 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2"] Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681082 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="pull" Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681093 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="pull" Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681103 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="util" Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681109 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="util" Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681123 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="extract" Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681129 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="extract" Apr 17 11:22:49.681239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.681173 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="162f9679-9d0c-4190-ae1d-2f31a45a6944" containerName="extract" Apr 17 11:22:49.688029 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.688010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.690416 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.690390 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 11:22:49.690764 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.690746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 11:22:49.690891 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.690802 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 11:22:49.690990 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.690972 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2d597\"" Apr 17 11:22:49.691054 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.691044 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 11:22:49.691317 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.691304 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 11:22:49.696781 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:22:49.696756 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2"] Apr 17 11:22:49.807839 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.807801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhfmw\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-kube-api-access-lhfmw\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.807839 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.807844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.808043 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.807953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/46e75b93-8d7b-45c1-ab81-7add75b9dc05-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.908996 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.908964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhfmw\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-kube-api-access-lhfmw\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.909183 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.909005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.909183 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.909053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/46e75b93-8d7b-45c1-ab81-7add75b9dc05-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.909295 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:49.909181 2577 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:22:49.909295 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:49.909201 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:22:49.909295 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:49.909219 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2: references non-existent secret key: tls.crt Apr 17 11:22:49.909295 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:49.909280 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates podName:46e75b93-8d7b-45c1-ab81-7add75b9dc05 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:50.409257087 +0000 UTC m=+398.947179300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates") pod "keda-metrics-apiserver-7c9f485588-d7kw2" (UID: "46e75b93-8d7b-45c1-ab81-7add75b9dc05") : references non-existent secret key: tls.crt Apr 17 11:22:49.909490 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.909398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/46e75b93-8d7b-45c1-ab81-7add75b9dc05-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:49.918042 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:49.918008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhfmw\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-kube-api-access-lhfmw\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:50.020832 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.020794 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-m5c6f"] Apr 17 11:22:50.023971 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.023954 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.027243 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.027221 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 11:22:50.038859 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.038828 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-m5c6f"] Apr 17 11:22:50.111297 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.111264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6vpj\" (UniqueName: \"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-kube-api-access-f6vpj\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.111502 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.111382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-certificates\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.211963 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.211926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-certificates\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.211963 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.211970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6vpj\" (UniqueName: 
\"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-kube-api-access-f6vpj\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.212188 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.212089 2577 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 17 11:22:50.212188 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.212115 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-m5c6f: secret "keda-admission-webhooks-certs" not found Apr 17 11:22:50.212188 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.212188 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-certificates podName:980ac293-4f7b-4f1e-8cce-d12e05c2a958 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:50.712169897 +0000 UTC m=+399.250092121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-certificates") pod "keda-admission-cf49989db-m5c6f" (UID: "980ac293-4f7b-4f1e-8cce-d12e05c2a958") : secret "keda-admission-webhooks-certs" not found Apr 17 11:22:50.220487 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.220461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6vpj\" (UniqueName: \"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-kube-api-access-f6vpj\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.413983 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.413865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:50.414147 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.414034 2577 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:22:50.414147 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.414053 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:22:50.414147 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.414073 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2: references non-existent secret key: tls.crt Apr 17 11:22:50.414147 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:50.414138 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates 
podName:46e75b93-8d7b-45c1-ab81-7add75b9dc05 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:51.414121988 +0000 UTC m=+399.952044198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates") pod "keda-metrics-apiserver-7c9f485588-d7kw2" (UID: "46e75b93-8d7b-45c1-ab81-7add75b9dc05") : references non-existent secret key: tls.crt Apr 17 11:22:50.717357 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.717325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-certificates\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.720003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.719975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/980ac293-4f7b-4f1e-8cce-d12e05c2a958-certificates\") pod \"keda-admission-cf49989db-m5c6f\" (UID: \"980ac293-4f7b-4f1e-8cce-d12e05c2a958\") " pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:50.934705 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:50.934656 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:51.059999 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:51.059964 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-m5c6f"] Apr 17 11:22:51.063102 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:22:51.063072 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980ac293_4f7b_4f1e_8cce_d12e05c2a958.slice/crio-5c75c728a86a59e91082b3789441bbba9b1756cd26ef5667eff928d0d33e655e WatchSource:0}: Error finding container 5c75c728a86a59e91082b3789441bbba9b1756cd26ef5667eff928d0d33e655e: Status 404 returned error can't find the container with id 5c75c728a86a59e91082b3789441bbba9b1756cd26ef5667eff928d0d33e655e Apr 17 11:22:51.244539 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:51.244451 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-m5c6f" event={"ID":"980ac293-4f7b-4f1e-8cce-d12e05c2a958","Type":"ContainerStarted","Data":"5c75c728a86a59e91082b3789441bbba9b1756cd26ef5667eff928d0d33e655e"} Apr 17 11:22:51.424762 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:51.424717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:51.424911 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:51.424863 2577 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:22:51.424911 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:51.424883 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:22:51.424911 ip-10-0-128-205 
kubenswrapper[2577]: E0417 11:22:51.424904 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2: references non-existent secret key: tls.crt Apr 17 11:22:51.425046 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:22:51.424957 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates podName:46e75b93-8d7b-45c1-ab81-7add75b9dc05 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:53.424942688 +0000 UTC m=+401.962864899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates") pod "keda-metrics-apiserver-7c9f485588-d7kw2" (UID: "46e75b93-8d7b-45c1-ab81-7add75b9dc05") : references non-existent secret key: tls.crt Apr 17 11:22:53.251570 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.251529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-m5c6f" event={"ID":"980ac293-4f7b-4f1e-8cce-d12e05c2a958","Type":"ContainerStarted","Data":"9e721856db34a2b717fdc4236db5c1ea858fbec34c9adb685affdfe816a25017"} Apr 17 11:22:53.251996 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.251691 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:22:53.269381 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.269312 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-m5c6f" podStartSLOduration=2.956773202 podStartE2EDuration="4.26929738s" podCreationTimestamp="2026-04-17 11:22:49 +0000 UTC" firstStartedPulling="2026-04-17 11:22:51.064523345 +0000 UTC m=+399.602445555" lastFinishedPulling="2026-04-17 11:22:52.377047524 +0000 UTC m=+400.914969733" observedRunningTime="2026-04-17 11:22:53.268447479 
+0000 UTC m=+401.806369715" watchObservedRunningTime="2026-04-17 11:22:53.26929738 +0000 UTC m=+401.807219629" Apr 17 11:22:53.441624 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.441571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:53.444225 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.444202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46e75b93-8d7b-45c1-ab81-7add75b9dc05-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d7kw2\" (UID: \"46e75b93-8d7b-45c1-ab81-7add75b9dc05\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:53.598975 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.598890 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:53.718193 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:53.718160 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2"] Apr 17 11:22:53.721321 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:22:53.721293 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46e75b93_8d7b_45c1_ab81_7add75b9dc05.slice/crio-5c256e24e0bbc20cd79b6bea4e9dde7ba0ecf3913e38e7d9dbaee561641d5847 WatchSource:0}: Error finding container 5c256e24e0bbc20cd79b6bea4e9dde7ba0ecf3913e38e7d9dbaee561641d5847: Status 404 returned error can't find the container with id 5c256e24e0bbc20cd79b6bea4e9dde7ba0ecf3913e38e7d9dbaee561641d5847 Apr 17 11:22:54.255602 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:54.255561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" event={"ID":"46e75b93-8d7b-45c1-ab81-7add75b9dc05","Type":"ContainerStarted","Data":"5c256e24e0bbc20cd79b6bea4e9dde7ba0ecf3913e38e7d9dbaee561641d5847"} Apr 17 11:22:56.262903 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:56.262868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" event={"ID":"46e75b93-8d7b-45c1-ab81-7add75b9dc05","Type":"ContainerStarted","Data":"2d9499c5d2c1d53f80a494ae2aa11d5c0b5a18ee55bbeb13dd3eeb0614bfc88d"} Apr 17 11:22:56.263290 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:56.263036 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:22:56.285646 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:22:56.285588 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" 
podStartSLOduration=4.926471543 podStartE2EDuration="7.285570182s" podCreationTimestamp="2026-04-17 11:22:49 +0000 UTC" firstStartedPulling="2026-04-17 11:22:53.722701032 +0000 UTC m=+402.260623242" lastFinishedPulling="2026-04-17 11:22:56.081799671 +0000 UTC m=+404.619721881" observedRunningTime="2026-04-17 11:22:56.284605583 +0000 UTC m=+404.822527831" watchObservedRunningTime="2026-04-17 11:22:56.285570182 +0000 UTC m=+404.823492417" Apr 17 11:23:07.269998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:07.269969 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d7kw2" Apr 17 11:23:14.258196 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:14.258167 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-m5c6f" Apr 17 11:23:42.503062 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.502986 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk"] Apr 17 11:23:42.511242 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.511221 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.513554 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.513534 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 11:23:42.513694 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.513563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8xcdc\"" Apr 17 11:23:42.514179 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.514137 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 11:23:42.514288 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.514267 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk"] Apr 17 11:23:42.517590 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.517565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.517714 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.517610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpvj\" (UniqueName: \"kubernetes.io/projected/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-kube-api-access-4wpvj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.517771 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.517746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.618400 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.618344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.618574 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.618407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpvj\" (UniqueName: \"kubernetes.io/projected/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-kube-api-access-4wpvj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.618574 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.618472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.618740 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.618720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.618775 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.618744 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.626716 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.626695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpvj\" (UniqueName: \"kubernetes.io/projected/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-kube-api-access-4wpvj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.821208 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.821131 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:42.939932 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:42.939898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk"] Apr 17 11:23:42.943111 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:23:42.943085 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3b49ef_a886_40b6_99f4_d4fd722a99ff.slice/crio-21746aa911eb9b6c49b432400d8786698e6655c815f4bccbe6624ddb6df28111 WatchSource:0}: Error finding container 21746aa911eb9b6c49b432400d8786698e6655c815f4bccbe6624ddb6df28111: Status 404 returned error can't find the container with id 21746aa911eb9b6c49b432400d8786698e6655c815f4bccbe6624ddb6df28111 Apr 17 11:23:43.394385 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:43.394335 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerID="c3fc3a3e18ac64b0cb83b4bb5380f781aa537c448cf9ad2eac320380a34db459" exitCode=0 Apr 17 11:23:43.394692 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:43.394391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" event={"ID":"4e3b49ef-a886-40b6-99f4-d4fd722a99ff","Type":"ContainerDied","Data":"c3fc3a3e18ac64b0cb83b4bb5380f781aa537c448cf9ad2eac320380a34db459"} Apr 17 11:23:43.394692 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:43.394428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" event={"ID":"4e3b49ef-a886-40b6-99f4-d4fd722a99ff","Type":"ContainerStarted","Data":"21746aa911eb9b6c49b432400d8786698e6655c815f4bccbe6624ddb6df28111"} Apr 17 11:23:44.399438 ip-10-0-128-205 kubenswrapper[2577]: 
I0417 11:23:44.399332 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerID="16da2c00ef7f67e6bfd7a2df95b02a37fc90e60fc51afc7a6d00befa1038f4d5" exitCode=0 Apr 17 11:23:44.399438 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:44.399410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" event={"ID":"4e3b49ef-a886-40b6-99f4-d4fd722a99ff","Type":"ContainerDied","Data":"16da2c00ef7f67e6bfd7a2df95b02a37fc90e60fc51afc7a6d00befa1038f4d5"} Apr 17 11:23:45.409356 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:45.409324 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerID="3bf921f734bba6f54cc14b9cdde331cf4ae078512ed5ac9c8a3bdfb0910e7e5b" exitCode=0 Apr 17 11:23:45.409736 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:45.409400 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" event={"ID":"4e3b49ef-a886-40b6-99f4-d4fd722a99ff","Type":"ContainerDied","Data":"3bf921f734bba6f54cc14b9cdde331cf4ae078512ed5ac9c8a3bdfb0910e7e5b"} Apr 17 11:23:46.542085 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.542058 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:46.646116 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.646076 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpvj\" (UniqueName: \"kubernetes.io/projected/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-kube-api-access-4wpvj\") pod \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " Apr 17 11:23:46.646317 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.646179 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-util\") pod \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " Apr 17 11:23:46.646317 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.646216 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-bundle\") pod \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\" (UID: \"4e3b49ef-a886-40b6-99f4-d4fd722a99ff\") " Apr 17 11:23:46.646879 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.646852 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-bundle" (OuterVolumeSpecName: "bundle") pod "4e3b49ef-a886-40b6-99f4-d4fd722a99ff" (UID: "4e3b49ef-a886-40b6-99f4-d4fd722a99ff"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:23:46.648580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.648552 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-kube-api-access-4wpvj" (OuterVolumeSpecName: "kube-api-access-4wpvj") pod "4e3b49ef-a886-40b6-99f4-d4fd722a99ff" (UID: "4e3b49ef-a886-40b6-99f4-d4fd722a99ff"). InnerVolumeSpecName "kube-api-access-4wpvj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:23:46.652233 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.652209 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-util" (OuterVolumeSpecName: "util") pod "4e3b49ef-a886-40b6-99f4-d4fd722a99ff" (UID: "4e3b49ef-a886-40b6-99f4-d4fd722a99ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:23:46.747105 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.747072 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:23:46.747105 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.747106 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:23:46.747293 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:46.747115 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wpvj\" (UniqueName: \"kubernetes.io/projected/4e3b49ef-a886-40b6-99f4-d4fd722a99ff-kube-api-access-4wpvj\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:23:47.417964 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:47.417934 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" Apr 17 11:23:47.418174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:47.417934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s77vk" event={"ID":"4e3b49ef-a886-40b6-99f4-d4fd722a99ff","Type":"ContainerDied","Data":"21746aa911eb9b6c49b432400d8786698e6655c815f4bccbe6624ddb6df28111"} Apr 17 11:23:47.418174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:47.418049 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21746aa911eb9b6c49b432400d8786698e6655c815f4bccbe6624ddb6df28111" Apr 17 11:23:59.068502 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068464 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj"] Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068753 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="pull" Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068764 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="pull" Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068778 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="util" Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068783 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="util" Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068789 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="extract" Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068794 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="extract" Apr 17 11:23:59.068910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.068843 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e3b49ef-a886-40b6-99f4-d4fd722a99ff" containerName="extract" Apr 17 11:23:59.073165 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.073145 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.075351 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.075331 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 11:23:59.075818 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.075798 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8xcdc\"" Apr 17 11:23:59.075818 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.075801 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 11:23:59.085956 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.085933 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj"] Apr 17 11:23:59.145442 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.145404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ddb\" (UniqueName: \"kubernetes.io/projected/d915f58e-95c9-4f7b-b344-41ae75084e6f-kube-api-access-47ddb\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: 
\"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.145638 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.145454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.145638 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.145543 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.245912 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.245875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.246074 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.245945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47ddb\" (UniqueName: \"kubernetes.io/projected/d915f58e-95c9-4f7b-b344-41ae75084e6f-kube-api-access-47ddb\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.246074 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.245992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.246284 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.246263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.246318 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.246273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.290619 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.290584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ddb\" (UniqueName: \"kubernetes.io/projected/d915f58e-95c9-4f7b-b344-41ae75084e6f-kube-api-access-47ddb\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 
11:23:59.382627 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.382522 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:23:59.513257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:23:59.513232 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj"] Apr 17 11:23:59.516109 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:23:59.516081 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd915f58e_95c9_4f7b_b344_41ae75084e6f.slice/crio-d6c84e4e02cecac1e0be04a2ed493d7034584c41f4f66ef756515c56a0c44b28 WatchSource:0}: Error finding container d6c84e4e02cecac1e0be04a2ed493d7034584c41f4f66ef756515c56a0c44b28: Status 404 returned error can't find the container with id d6c84e4e02cecac1e0be04a2ed493d7034584c41f4f66ef756515c56a0c44b28 Apr 17 11:24:00.454432 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:00.454401 2577 generic.go:358] "Generic (PLEG): container finished" podID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerID="5db7c3f14a444409e65a298125f1f3a5efcad515691d1a7425d3c6fcaeb3b6c9" exitCode=0 Apr 17 11:24:00.454790 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:00.454442 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" event={"ID":"d915f58e-95c9-4f7b-b344-41ae75084e6f","Type":"ContainerDied","Data":"5db7c3f14a444409e65a298125f1f3a5efcad515691d1a7425d3c6fcaeb3b6c9"} Apr 17 11:24:00.454790 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:00.454463 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" 
event={"ID":"d915f58e-95c9-4f7b-b344-41ae75084e6f","Type":"ContainerStarted","Data":"d6c84e4e02cecac1e0be04a2ed493d7034584c41f4f66ef756515c56a0c44b28"} Apr 17 11:24:03.464752 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:03.464719 2577 generic.go:358] "Generic (PLEG): container finished" podID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerID="0326fb11a41dcdc9efd4e55b5acd4c8f9d68dae2bb4ee4cb4a7b87620d9efd1c" exitCode=0 Apr 17 11:24:03.465217 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:03.464788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" event={"ID":"d915f58e-95c9-4f7b-b344-41ae75084e6f","Type":"ContainerDied","Data":"0326fb11a41dcdc9efd4e55b5acd4c8f9d68dae2bb4ee4cb4a7b87620d9efd1c"} Apr 17 11:24:04.469568 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:04.469538 2577 generic.go:358] "Generic (PLEG): container finished" podID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerID="5e8f5a88ca76610d0b1af0a0c30785a7ac7d3ddf641afb0e0b8ca6b12863e808" exitCode=0 Apr 17 11:24:04.469941 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:04.469611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" event={"ID":"d915f58e-95c9-4f7b-b344-41ae75084e6f","Type":"ContainerDied","Data":"5e8f5a88ca76610d0b1af0a0c30785a7ac7d3ddf641afb0e0b8ca6b12863e808"} Apr 17 11:24:05.596884 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.596860 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:24:05.697604 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.697570 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-util\") pod \"d915f58e-95c9-4f7b-b344-41ae75084e6f\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " Apr 17 11:24:05.697604 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.697614 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47ddb\" (UniqueName: \"kubernetes.io/projected/d915f58e-95c9-4f7b-b344-41ae75084e6f-kube-api-access-47ddb\") pod \"d915f58e-95c9-4f7b-b344-41ae75084e6f\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " Apr 17 11:24:05.697842 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.697666 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-bundle\") pod \"d915f58e-95c9-4f7b-b344-41ae75084e6f\" (UID: \"d915f58e-95c9-4f7b-b344-41ae75084e6f\") " Apr 17 11:24:05.698069 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.698040 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-bundle" (OuterVolumeSpecName: "bundle") pod "d915f58e-95c9-4f7b-b344-41ae75084e6f" (UID: "d915f58e-95c9-4f7b-b344-41ae75084e6f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:24:05.699956 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.699933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d915f58e-95c9-4f7b-b344-41ae75084e6f-kube-api-access-47ddb" (OuterVolumeSpecName: "kube-api-access-47ddb") pod "d915f58e-95c9-4f7b-b344-41ae75084e6f" (UID: "d915f58e-95c9-4f7b-b344-41ae75084e6f"). InnerVolumeSpecName "kube-api-access-47ddb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:24:05.703914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.703884 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-util" (OuterVolumeSpecName: "util") pod "d915f58e-95c9-4f7b-b344-41ae75084e6f" (UID: "d915f58e-95c9-4f7b-b344-41ae75084e6f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:24:05.798560 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.798470 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:24:05.798560 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.798503 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47ddb\" (UniqueName: \"kubernetes.io/projected/d915f58e-95c9-4f7b-b344-41ae75084e6f-kube-api-access-47ddb\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:24:05.798560 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:05.798514 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d915f58e-95c9-4f7b-b344-41ae75084e6f-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:24:06.477618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:06.477587 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" Apr 17 11:24:06.477618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:06.477601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvlwj" event={"ID":"d915f58e-95c9-4f7b-b344-41ae75084e6f","Type":"ContainerDied","Data":"d6c84e4e02cecac1e0be04a2ed493d7034584c41f4f66ef756515c56a0c44b28"} Apr 17 11:24:06.477816 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:06.477631 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c84e4e02cecac1e0be04a2ed493d7034584c41f4f66ef756515c56a0c44b28" Apr 17 11:24:09.455073 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455037 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-ntdfl"] Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455301 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="extract" Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455312 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="extract" Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455331 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="pull" Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455336 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="pull" Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455343 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="util" Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455349 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="util" Apr 17 11:24:09.455435 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.455413 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d915f58e-95c9-4f7b-b344-41ae75084e6f" containerName="extract" Apr 17 11:24:09.461085 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.461069 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-ntdfl" Apr 17 11:24:09.463323 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.463305 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 11:24:09.463778 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.463761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-7pgjv\"" Apr 17 11:24:09.463829 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.463795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 11:24:09.467115 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.467094 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-ntdfl"] Apr 17 11:24:09.523825 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.523785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btbgs\" (UniqueName: \"kubernetes.io/projected/26cd2816-7240-4b22-89f6-bb86085cc67b-kube-api-access-btbgs\") pod \"cert-manager-759f64656b-ntdfl\" (UID: \"26cd2816-7240-4b22-89f6-bb86085cc67b\") " pod="cert-manager/cert-manager-759f64656b-ntdfl" Apr 17 11:24:09.523825 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:24:09.523823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26cd2816-7240-4b22-89f6-bb86085cc67b-bound-sa-token\") pod \"cert-manager-759f64656b-ntdfl\" (UID: \"26cd2816-7240-4b22-89f6-bb86085cc67b\") " pod="cert-manager/cert-manager-759f64656b-ntdfl" Apr 17 11:24:09.624648 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.624607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btbgs\" (UniqueName: \"kubernetes.io/projected/26cd2816-7240-4b22-89f6-bb86085cc67b-kube-api-access-btbgs\") pod \"cert-manager-759f64656b-ntdfl\" (UID: \"26cd2816-7240-4b22-89f6-bb86085cc67b\") " pod="cert-manager/cert-manager-759f64656b-ntdfl" Apr 17 11:24:09.624648 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.624649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26cd2816-7240-4b22-89f6-bb86085cc67b-bound-sa-token\") pod \"cert-manager-759f64656b-ntdfl\" (UID: \"26cd2816-7240-4b22-89f6-bb86085cc67b\") " pod="cert-manager/cert-manager-759f64656b-ntdfl" Apr 17 11:24:09.632507 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.632480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26cd2816-7240-4b22-89f6-bb86085cc67b-bound-sa-token\") pod \"cert-manager-759f64656b-ntdfl\" (UID: \"26cd2816-7240-4b22-89f6-bb86085cc67b\") " pod="cert-manager/cert-manager-759f64656b-ntdfl" Apr 17 11:24:09.632641 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.632548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btbgs\" (UniqueName: \"kubernetes.io/projected/26cd2816-7240-4b22-89f6-bb86085cc67b-kube-api-access-btbgs\") pod \"cert-manager-759f64656b-ntdfl\" (UID: 
\"26cd2816-7240-4b22-89f6-bb86085cc67b\") " pod="cert-manager/cert-manager-759f64656b-ntdfl"
Apr 17 11:24:09.770736 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.770657 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-ntdfl"
Apr 17 11:24:09.889126 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:09.889095 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-ntdfl"]
Apr 17 11:24:09.891641 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:24:09.891612 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26cd2816_7240_4b22_89f6_bb86085cc67b.slice/crio-792bb6f5cdf43dad8e58aa026932e857dfddd3c7a6ffe683aabe9f2c3bd3dde6 WatchSource:0}: Error finding container 792bb6f5cdf43dad8e58aa026932e857dfddd3c7a6ffe683aabe9f2c3bd3dde6: Status 404 returned error can't find the container with id 792bb6f5cdf43dad8e58aa026932e857dfddd3c7a6ffe683aabe9f2c3bd3dde6
Apr 17 11:24:10.493376 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:10.493318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-ntdfl" event={"ID":"26cd2816-7240-4b22-89f6-bb86085cc67b","Type":"ContainerStarted","Data":"792bb6f5cdf43dad8e58aa026932e857dfddd3c7a6ffe683aabe9f2c3bd3dde6"}
Apr 17 11:24:13.504285 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:13.504248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-ntdfl" event={"ID":"26cd2816-7240-4b22-89f6-bb86085cc67b","Type":"ContainerStarted","Data":"85a510086341351e585b0bf4416942d02754787a679bf439e629cc2acf782a14"}
Apr 17 11:24:13.521432 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:13.521359 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-ntdfl" podStartSLOduration=1.943538669 podStartE2EDuration="4.521346318s" podCreationTimestamp="2026-04-17 11:24:09 +0000 UTC" firstStartedPulling="2026-04-17 11:24:09.893537345 +0000 UTC m=+478.431459568" lastFinishedPulling="2026-04-17 11:24:12.471345005 +0000 UTC m=+481.009267217" observedRunningTime="2026-04-17 11:24:13.520720081 +0000 UTC m=+482.058642313" watchObservedRunningTime="2026-04-17 11:24:13.521346318 +0000 UTC m=+482.059268550"
Apr 17 11:24:29.413251 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.413212 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"]
Apr 17 11:24:29.418622 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.418603 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.421166 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.421144 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 11:24:29.421284 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.421186 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 11:24:29.421624 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.421594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8xcdc\""
Apr 17 11:24:29.423815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.423792 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"]
Apr 17 11:24:29.486259 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.486223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgz28\" (UniqueName: \"kubernetes.io/projected/45fbae17-37f2-4f12-bd99-9e5f1807ef42-kube-api-access-pgz28\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.486439 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.486279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.486439 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.486303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.586841 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.586807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgz28\" (UniqueName: \"kubernetes.io/projected/45fbae17-37f2-4f12-bd99-9e5f1807ef42-kube-api-access-pgz28\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.587014 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.586847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.587014 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.586868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.587208 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.587193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.587256 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.587234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.598322 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.598287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgz28\" (UniqueName: \"kubernetes.io/projected/45fbae17-37f2-4f12-bd99-9e5f1807ef42-kube-api-access-pgz28\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.728445 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.728401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:29.846790 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:29.846758 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"]
Apr 17 11:24:29.849718 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:24:29.849696 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45fbae17_37f2_4f12_bd99_9e5f1807ef42.slice/crio-4e67ce1699560543f8543c543bea68c41b9a4dc18da95ba5bd7f67ea5c549726 WatchSource:0}: Error finding container 4e67ce1699560543f8543c543bea68c41b9a4dc18da95ba5bd7f67ea5c549726: Status 404 returned error can't find the container with id 4e67ce1699560543f8543c543bea68c41b9a4dc18da95ba5bd7f67ea5c549726
Apr 17 11:24:30.553643 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:30.553606 2577 generic.go:358] "Generic (PLEG): container finished" podID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerID="725199adaffaff67aada77d91fb15aa12297784e9566eecc35e438e3e4bd5b15" exitCode=0
Apr 17 11:24:30.553988 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:30.553666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj" event={"ID":"45fbae17-37f2-4f12-bd99-9e5f1807ef42","Type":"ContainerDied","Data":"725199adaffaff67aada77d91fb15aa12297784e9566eecc35e438e3e4bd5b15"}
Apr 17 11:24:30.553988 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:30.553694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj" event={"ID":"45fbae17-37f2-4f12-bd99-9e5f1807ef42","Type":"ContainerStarted","Data":"4e67ce1699560543f8543c543bea68c41b9a4dc18da95ba5bd7f67ea5c549726"}
Apr 17 11:24:31.557731 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:31.557691 2577 generic.go:358] "Generic (PLEG): container finished" podID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerID="67a0ae5fc14b85cfa60fb1e36e25a4147146a95c6b57886aa50a54e53abf3814" exitCode=0
Apr 17 11:24:31.558140 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:31.557743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj" event={"ID":"45fbae17-37f2-4f12-bd99-9e5f1807ef42","Type":"ContainerDied","Data":"67a0ae5fc14b85cfa60fb1e36e25a4147146a95c6b57886aa50a54e53abf3814"}
Apr 17 11:24:32.561866 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:32.561832 2577 generic.go:358] "Generic (PLEG): container finished" podID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerID="2fc3f9be86f6db67f9053d2fa6bbed0e693dce4d4c17730fd1c88d4c3afd8b2f" exitCode=0
Apr 17 11:24:32.562239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:32.561905 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj" event={"ID":"45fbae17-37f2-4f12-bd99-9e5f1807ef42","Type":"ContainerDied","Data":"2fc3f9be86f6db67f9053d2fa6bbed0e693dce4d4c17730fd1c88d4c3afd8b2f"}
Apr 17 11:24:33.687275 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.687251 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:33.819393 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.819273 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgz28\" (UniqueName: \"kubernetes.io/projected/45fbae17-37f2-4f12-bd99-9e5f1807ef42-kube-api-access-pgz28\") pod \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") "
Apr 17 11:24:33.819393 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.819335 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-bundle\") pod \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") "
Apr 17 11:24:33.819638 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.819425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-util\") pod \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\" (UID: \"45fbae17-37f2-4f12-bd99-9e5f1807ef42\") "
Apr 17 11:24:33.820262 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.820231 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-bundle" (OuterVolumeSpecName: "bundle") pod "45fbae17-37f2-4f12-bd99-9e5f1807ef42" (UID: "45fbae17-37f2-4f12-bd99-9e5f1807ef42"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:24:33.821493 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.821467 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fbae17-37f2-4f12-bd99-9e5f1807ef42-kube-api-access-pgz28" (OuterVolumeSpecName: "kube-api-access-pgz28") pod "45fbae17-37f2-4f12-bd99-9e5f1807ef42" (UID: "45fbae17-37f2-4f12-bd99-9e5f1807ef42"). InnerVolumeSpecName "kube-api-access-pgz28". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:24:33.824857 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.824830 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-util" (OuterVolumeSpecName: "util") pod "45fbae17-37f2-4f12-bd99-9e5f1807ef42" (UID: "45fbae17-37f2-4f12-bd99-9e5f1807ef42"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:24:33.920257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.920223 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgz28\" (UniqueName: \"kubernetes.io/projected/45fbae17-37f2-4f12-bd99-9e5f1807ef42-kube-api-access-pgz28\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:24:33.920257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.920254 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:24:33.920490 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:33.920267 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45fbae17-37f2-4f12-bd99-9e5f1807ef42-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:24:34.570148 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:34.570121 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj"
Apr 17 11:24:34.570309 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:34.570119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kg2dj" event={"ID":"45fbae17-37f2-4f12-bd99-9e5f1807ef42","Type":"ContainerDied","Data":"4e67ce1699560543f8543c543bea68c41b9a4dc18da95ba5bd7f67ea5c549726"}
Apr 17 11:24:34.570309 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:34.570223 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e67ce1699560543f8543c543bea68c41b9a4dc18da95ba5bd7f67ea5c549726"
Apr 17 11:24:39.542755 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.542725 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"]
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.542982 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="pull"
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.542992 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="pull"
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.543001 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="util"
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.543006 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="util"
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.543017 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="extract"
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.543024 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="extract"
Apr 17 11:24:39.543119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.543071 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="45fbae17-37f2-4f12-bd99-9e5f1807ef42" containerName="extract"
Apr 17 11:24:39.547070 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.547053 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.549240 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.549220 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8xcdc\""
Apr 17 11:24:39.549581 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.549565 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 11:24:39.549791 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.549775 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 11:24:39.563805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.563784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"]
Apr 17 11:24:39.667590 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.667553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.667590 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.667595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.667801 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.667630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvf9\" (UniqueName: \"kubernetes.io/projected/7db61d28-b202-43ee-9fbf-30286b6333ea-kube-api-access-5dvf9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.768376 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.768339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvf9\" (UniqueName: \"kubernetes.io/projected/7db61d28-b202-43ee-9fbf-30286b6333ea-kube-api-access-5dvf9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.768548 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.768442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.768548 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.768467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.768769 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.768753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.768822 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.768802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.804143 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.804077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvf9\" (UniqueName: \"kubernetes.io/projected/7db61d28-b202-43ee-9fbf-30286b6333ea-kube-api-access-5dvf9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:39.856145 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:39.856109 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:40.001461 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.001431 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"]
Apr 17 11:24:40.005442 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:24:40.005414 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db61d28_b202_43ee_9fbf_30286b6333ea.slice/crio-3aac031e082a60353d721772d137b7637107ca6e0f8dfc358a1c8c22168c8c9a WatchSource:0}: Error finding container 3aac031e082a60353d721772d137b7637107ca6e0f8dfc358a1c8c22168c8c9a: Status 404 returned error can't find the container with id 3aac031e082a60353d721772d137b7637107ca6e0f8dfc358a1c8c22168c8c9a
Apr 17 11:24:40.588704 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.588668 2577 generic.go:358] "Generic (PLEG): container finished" podID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerID="d1ef37ff81e7a27955d7182cbe71818850d370924db06a79dfa61ebbdb1aed49" exitCode=0
Apr 17 11:24:40.589092 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.588727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p" event={"ID":"7db61d28-b202-43ee-9fbf-30286b6333ea","Type":"ContainerDied","Data":"d1ef37ff81e7a27955d7182cbe71818850d370924db06a79dfa61ebbdb1aed49"}
Apr 17 11:24:40.589092 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.588754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p" event={"ID":"7db61d28-b202-43ee-9fbf-30286b6333ea","Type":"ContainerStarted","Data":"3aac031e082a60353d721772d137b7637107ca6e0f8dfc358a1c8c22168c8c9a"}
Apr 17 11:24:40.757325 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.757293 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"]
Apr 17 11:24:40.760342 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.760326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:40.763093 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.763071 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-6dnx8\""
Apr 17 11:24:40.763785 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.763765 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 17 11:24:40.766010 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.765993 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 17 11:24:40.781725 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.781702 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"]
Apr 17 11:24:40.877174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.877084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69dl\" (UniqueName: \"kubernetes.io/projected/12d57ff6-e6ca-472c-a177-e5799fc04967-kube-api-access-t69dl\") pod \"servicemesh-operator3-55f49c5f94-9gwmq\" (UID: \"12d57ff6-e6ca-472c-a177-e5799fc04967\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:40.877174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.877124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/12d57ff6-e6ca-472c-a177-e5799fc04967-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9gwmq\" (UID: \"12d57ff6-e6ca-472c-a177-e5799fc04967\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:40.977893 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.977852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t69dl\" (UniqueName: \"kubernetes.io/projected/12d57ff6-e6ca-472c-a177-e5799fc04967-kube-api-access-t69dl\") pod \"servicemesh-operator3-55f49c5f94-9gwmq\" (UID: \"12d57ff6-e6ca-472c-a177-e5799fc04967\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:40.977893 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.977898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/12d57ff6-e6ca-472c-a177-e5799fc04967-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9gwmq\" (UID: \"12d57ff6-e6ca-472c-a177-e5799fc04967\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:40.980637 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.980610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/12d57ff6-e6ca-472c-a177-e5799fc04967-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9gwmq\" (UID: \"12d57ff6-e6ca-472c-a177-e5799fc04967\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:40.986131 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:40.986108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69dl\" (UniqueName: \"kubernetes.io/projected/12d57ff6-e6ca-472c-a177-e5799fc04967-kube-api-access-t69dl\") pod \"servicemesh-operator3-55f49c5f94-9gwmq\" (UID: \"12d57ff6-e6ca-472c-a177-e5799fc04967\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:41.069173 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:41.069119 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:41.198561 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:41.198529 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"]
Apr 17 11:24:41.202004 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:24:41.201978 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12d57ff6_e6ca_472c_a177_e5799fc04967.slice/crio-43f86abcbe3404141935d22df12fb026ddb7803d32175255a08c6857be0d192e WatchSource:0}: Error finding container 43f86abcbe3404141935d22df12fb026ddb7803d32175255a08c6857be0d192e: Status 404 returned error can't find the container with id 43f86abcbe3404141935d22df12fb026ddb7803d32175255a08c6857be0d192e
Apr 17 11:24:41.593386 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:41.593330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq" event={"ID":"12d57ff6-e6ca-472c-a177-e5799fc04967","Type":"ContainerStarted","Data":"43f86abcbe3404141935d22df12fb026ddb7803d32175255a08c6857be0d192e"}
Apr 17 11:24:41.594907 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:41.594881 2577 generic.go:358] "Generic (PLEG): container finished" podID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerID="ab07e6f732e76c0185fa354c430f1f1ecad3515537f7afda49d7a010f859d163" exitCode=0
Apr 17 11:24:41.595020 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:41.594950 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p" event={"ID":"7db61d28-b202-43ee-9fbf-30286b6333ea","Type":"ContainerDied","Data":"ab07e6f732e76c0185fa354c430f1f1ecad3515537f7afda49d7a010f859d163"}
Apr 17 11:24:42.604588 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:42.604553 2577 generic.go:358] "Generic (PLEG): container finished" podID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerID="6c3b7549d784aeb4b3cf475f95390fbb325d24d90b5dfeb8d0cf5df729cc591e" exitCode=0
Apr 17 11:24:42.604950 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:42.604595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p" event={"ID":"7db61d28-b202-43ee-9fbf-30286b6333ea","Type":"ContainerDied","Data":"6c3b7549d784aeb4b3cf475f95390fbb325d24d90b5dfeb8d0cf5df729cc591e"}
Apr 17 11:24:44.543197 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.543175 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:44.620752 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.620718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p" event={"ID":"7db61d28-b202-43ee-9fbf-30286b6333ea","Type":"ContainerDied","Data":"3aac031e082a60353d721772d137b7637107ca6e0f8dfc358a1c8c22168c8c9a"}
Apr 17 11:24:44.620752 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.620742 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24d96p"
Apr 17 11:24:44.620752 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.620751 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aac031e082a60353d721772d137b7637107ca6e0f8dfc358a1c8c22168c8c9a"
Apr 17 11:24:44.717769 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.717739 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dvf9\" (UniqueName: \"kubernetes.io/projected/7db61d28-b202-43ee-9fbf-30286b6333ea-kube-api-access-5dvf9\") pod \"7db61d28-b202-43ee-9fbf-30286b6333ea\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") "
Apr 17 11:24:44.717972 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.717810 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-util\") pod \"7db61d28-b202-43ee-9fbf-30286b6333ea\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") "
Apr 17 11:24:44.717972 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.717929 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-bundle\") pod \"7db61d28-b202-43ee-9fbf-30286b6333ea\" (UID: \"7db61d28-b202-43ee-9fbf-30286b6333ea\") "
Apr 17 11:24:44.718725 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.718700 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-bundle" (OuterVolumeSpecName: "bundle") pod "7db61d28-b202-43ee-9fbf-30286b6333ea" (UID: "7db61d28-b202-43ee-9fbf-30286b6333ea"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:24:44.719965 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.719941 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db61d28-b202-43ee-9fbf-30286b6333ea-kube-api-access-5dvf9" (OuterVolumeSpecName: "kube-api-access-5dvf9") pod "7db61d28-b202-43ee-9fbf-30286b6333ea" (UID: "7db61d28-b202-43ee-9fbf-30286b6333ea"). InnerVolumeSpecName "kube-api-access-5dvf9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:24:44.726462 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.726417 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-util" (OuterVolumeSpecName: "util") pod "7db61d28-b202-43ee-9fbf-30286b6333ea" (UID: "7db61d28-b202-43ee-9fbf-30286b6333ea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:24:44.818860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.818768 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dvf9\" (UniqueName: \"kubernetes.io/projected/7db61d28-b202-43ee-9fbf-30286b6333ea-kube-api-access-5dvf9\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:24:44.818860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.818817 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:24:44.818860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:44.818836 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7db61d28-b202-43ee-9fbf-30286b6333ea-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:24:45.626102 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:45.626066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq" event={"ID":"12d57ff6-e6ca-472c-a177-e5799fc04967","Type":"ContainerStarted","Data":"9a7db8fd53aa47fdbc3ecc3f1fa453f2ac3997f0b3911c3fdd7a134a8fbd32ac"}
Apr 17 11:24:45.626630 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:45.626224 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq"
Apr 17 11:24:45.652088 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:45.652039 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq" podStartSLOduration=2.263239292 podStartE2EDuration="5.652020669s" podCreationTimestamp="2026-04-17 11:24:40 +0000 UTC" firstStartedPulling="2026-04-17 11:24:41.204719097 +0000 UTC m=+509.742641307" lastFinishedPulling="2026-04-17 11:24:44.593500472 +0000 UTC m=+513.131422684" observedRunningTime="2026-04-17 11:24:45.65103253 +0000 UTC m=+514.188954774" watchObservedRunningTime="2026-04-17 11:24:45.652020669 +0000 UTC m=+514.189942900"
Apr 17 11:24:51.062022 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.061991 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"]
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062267 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="util"
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062277 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="util"
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062290 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="pull"
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062295 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="pull"
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062301 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="extract"
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062307 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="extract"
Apr 17 11:24:51.062446 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.062352 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7db61d28-b202-43ee-9fbf-30286b6333ea" containerName="extract"
Apr 17 11:24:51.066960 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.066937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"
Apr 17 11:24:51.069011 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.068989 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 17 11:24:51.069135 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.069039 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 11:24:51.069135 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.069052 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-72qnh\""
Apr 17 11:24:51.069135 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.069065 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 17 11:24:51.069135 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.068989 2577
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 11:24:51.070565 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.070636 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqr77\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-kube-api-access-hqr77\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.070636 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.070712 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18467dd6-5389-4798-afd0-aca868524109-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.070751 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070715 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.070751 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.070821 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.070759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18467dd6-5389-4798-afd0-aca868524109-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.077622 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.077590 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"] Apr 17 11:24:51.171944 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.171909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 
11:24:51.171944 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.171949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.172164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.171967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18467dd6-5389-4798-afd0-aca868524109-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.172164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.171994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.172164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.172014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqr77\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-kube-api-access-hqr77\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.172164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.172049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.172164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.172072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18467dd6-5389-4798-afd0-aca868524109-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.172929 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.172886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18467dd6-5389-4798-afd0-aca868524109-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.174597 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.174575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.174708 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.174600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 
11:24:51.174809 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.174790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18467dd6-5389-4798-afd0-aca868524109-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.174973 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.174952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.181072 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.181044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqr77\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-kube-api-access-hqr77\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.181586 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.181568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qndwb\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.379731 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.379629 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:51.515685 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.515645 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"] Apr 17 11:24:51.521151 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:24:51.521110 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18467dd6_5389_4798_afd0_aca868524109.slice/crio-c49af3cd419880d9c2f71cbca250e7a9497aac8d533661b0de3dac584a82a1f6 WatchSource:0}: Error finding container c49af3cd419880d9c2f71cbca250e7a9497aac8d533661b0de3dac584a82a1f6: Status 404 returned error can't find the container with id c49af3cd419880d9c2f71cbca250e7a9497aac8d533661b0de3dac584a82a1f6 Apr 17 11:24:51.646666 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:51.646581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" event={"ID":"18467dd6-5389-4798-afd0-aca868524109","Type":"ContainerStarted","Data":"c49af3cd419880d9c2f71cbca250e7a9497aac8d533661b0de3dac584a82a1f6"} Apr 17 11:24:54.232144 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:54.232104 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 17 11:24:54.232447 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:54.232174 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 17 11:24:54.665458 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:54.665344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" 
event={"ID":"18467dd6-5389-4798-afd0-aca868524109","Type":"ContainerStarted","Data":"feee5208d24d45d8fbf810d30dd678e1d534894f0d918c6f202afe7c8e619fff"} Apr 17 11:24:54.665597 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:54.665504 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:54.685394 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:54.685321 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" podStartSLOduration=0.976415312 podStartE2EDuration="3.685308435s" podCreationTimestamp="2026-04-17 11:24:51 +0000 UTC" firstStartedPulling="2026-04-17 11:24:51.522997083 +0000 UTC m=+520.060919293" lastFinishedPulling="2026-04-17 11:24:54.231890188 +0000 UTC m=+522.769812416" observedRunningTime="2026-04-17 11:24:54.684309361 +0000 UTC m=+523.222231592" watchObservedRunningTime="2026-04-17 11:24:54.685308435 +0000 UTC m=+523.223230667" Apr 17 11:24:55.676131 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:55.676104 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:24:56.632470 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:56.632436 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9gwmq" Apr 17 11:24:58.645714 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.645673 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq"] Apr 17 11:24:58.649034 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.649017 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.651226 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.651208 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-cpvwf\"" Apr 17 11:24:58.663254 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.663232 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq"] Apr 17 11:24:58.737981 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.737939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzxx\" (UniqueName: \"kubernetes.io/projected/77c235b0-17b5-4957-9e6c-91eb76b011d6-kube-api-access-lmzxx\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.737981 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.737979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/77c235b0-17b5-4957-9e6c-91eb76b011d6-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738215 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738215 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738215 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738215 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738215 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738498 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.738498 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.738280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839168 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzxx\" (UniqueName: \"kubernetes.io/projected/77c235b0-17b5-4957-9e6c-91eb76b011d6-kube-api-access-lmzxx\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839395 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/77c235b0-17b5-4957-9e6c-91eb76b011d6-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839395 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839395 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839395 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839395 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839646 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839646 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839753 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839820 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839798 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839891 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.839910 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.839902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.840489 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.840464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/77c235b0-17b5-4957-9e6c-91eb76b011d6-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.841738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.841716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-envoy\") pod 
\"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.841951 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.841927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.846517 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.846499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/77c235b0-17b5-4957-9e6c-91eb76b011d6-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.846637 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.846621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzxx\" (UniqueName: \"kubernetes.io/projected/77c235b0-17b5-4957-9e6c-91eb76b011d6-kube-api-access-lmzxx\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq\" (UID: \"77c235b0-17b5-4957-9e6c-91eb76b011d6\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:58.959308 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:58.959270 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:24:59.092079 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:59.092046 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq"] Apr 17 11:24:59.095132 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:24:59.095098 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c235b0_17b5_4957_9e6c_91eb76b011d6.slice/crio-8aa19bd9a8913c621ed7971c5f5bcaa3b6b37766870d9dd3518d69a5674ef8fe WatchSource:0}: Error finding container 8aa19bd9a8913c621ed7971c5f5bcaa3b6b37766870d9dd3518d69a5674ef8fe: Status 404 returned error can't find the container with id 8aa19bd9a8913c621ed7971c5f5bcaa3b6b37766870d9dd3518d69a5674ef8fe Apr 17 11:24:59.691375 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:24:59.691330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" event={"ID":"77c235b0-17b5-4957-9e6c-91eb76b011d6","Type":"ContainerStarted","Data":"8aa19bd9a8913c621ed7971c5f5bcaa3b6b37766870d9dd3518d69a5674ef8fe"} Apr 17 11:25:01.625758 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:01.625716 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 17 11:25:01.626072 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:01.625793 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 17 11:25:01.626072 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:01.625821 2577 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 17 11:25:01.699900 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:01.699869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" event={"ID":"77c235b0-17b5-4957-9e6c-91eb76b011d6","Type":"ContainerStarted","Data":"8c4f582d2da96160c7da77be5bd30b55a709091c12c9d168c131785ee5e38d16"} Apr 17 11:25:02.727055 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:02.727006 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" podStartSLOduration=2.198855711 podStartE2EDuration="4.726989397s" podCreationTimestamp="2026-04-17 11:24:58 +0000 UTC" firstStartedPulling="2026-04-17 11:24:59.097310058 +0000 UTC m=+527.635232268" lastFinishedPulling="2026-04-17 11:25:01.625443745 +0000 UTC m=+530.163365954" observedRunningTime="2026-04-17 11:25:02.725879413 +0000 UTC m=+531.263801646" watchObservedRunningTime="2026-04-17 11:25:02.726989397 +0000 UTC m=+531.264911629" Apr 17 11:25:02.959465 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:02.959426 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:25:02.964021 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:02.963995 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:25:03.707010 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:03.706978 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:25:03.707931 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:03.707912 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq" Apr 17 11:25:08.477479 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.477444 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv"] Apr 17 11:25:08.482467 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.482450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.484855 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.484832 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 11:25:08.484976 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.484927 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 11:25:08.484976 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.484935 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8xcdc\"" Apr 17 11:25:08.488650 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.488627 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv"] Apr 17 11:25:08.520978 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.520952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 
11:25:08.521133 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.520988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vf24\" (UniqueName: \"kubernetes.io/projected/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-kube-api-access-5vf24\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.521133 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.521070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.579406 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.579357 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq"] Apr 17 11:25:08.582838 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.582822 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.590924 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.590899 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq"] Apr 17 11:25:08.621566 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.621526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.621708 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.621586 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trsqt\" (UniqueName: \"kubernetes.io/projected/6082391e-80a3-4b55-b3b7-ab10578f3c56-kube-api-access-trsqt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.621708 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.621621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.621708 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.621702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.621869 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.621731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vf24\" (UniqueName: \"kubernetes.io/projected/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-kube-api-access-5vf24\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.621869 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.621774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.622034 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.622016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.622088 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.622048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.629735 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.629712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vf24\" (UniqueName: \"kubernetes.io/projected/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-kube-api-access-5vf24\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.678018 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.677984 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f"] Apr 17 11:25:08.681590 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.681569 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.689769 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.689745 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f"] Apr 17 11:25:08.722683 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.722647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.722849 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.722707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trsqt\" (UniqueName: \"kubernetes.io/projected/6082391e-80a3-4b55-b3b7-ab10578f3c56-kube-api-access-trsqt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.722849 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.722758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqfbc\" (UniqueName: \"kubernetes.io/projected/88460bf3-b65e-42c3-b453-c9fe3bca6c55-kube-api-access-gqfbc\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.722849 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.722792 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.723040 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.722848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.723040 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.722971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.723040 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.723006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.723343 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.723321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.730587 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.730531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trsqt\" (UniqueName: \"kubernetes.io/projected/6082391e-80a3-4b55-b3b7-ab10578f3c56-kube-api-access-trsqt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.781822 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.781786 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp"] Apr 17 11:25:08.785397 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.785381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.791830 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.791807 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:08.792893 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.792591 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp"] Apr 17 11:25:08.823466 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqfbc\" (UniqueName: \"kubernetes.io/projected/88460bf3-b65e-42c3-b453-c9fe3bca6c55-kube-api-access-gqfbc\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.823618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.823618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.823618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.823618 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.823891 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpd6n\" (UniqueName: \"kubernetes.io/projected/5157fcbc-2523-4106-a368-351d1fd02dc5-kube-api-access-kpd6n\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.823946 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.823946 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.823913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.831506 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.831477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqfbc\" (UniqueName: \"kubernetes.io/projected/88460bf3-b65e-42c3-b453-c9fe3bca6c55-kube-api-access-gqfbc\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:08.893617 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.893577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:08.925107 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.925075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.925107 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.925121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.925400 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.925261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpd6n\" (UniqueName: \"kubernetes.io/projected/5157fcbc-2523-4106-a368-351d1fd02dc5-kube-api-access-kpd6n\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.925635 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.925507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.925635 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.925558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.932647 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.932626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpd6n\" (UniqueName: \"kubernetes.io/projected/5157fcbc-2523-4106-a368-351d1fd02dc5-kube-api-access-kpd6n\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:08.939754 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.939733 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv"] Apr 17 11:25:08.941971 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:08.941944 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0be0c7_46bd_4c9f_a1ca_5bf16d14d3be.slice/crio-08fe3abba2c1a4ce79d41e5baa29223cad9395869dceb71ad746d2d73b3a46ee WatchSource:0}: Error finding container 08fe3abba2c1a4ce79d41e5baa29223cad9395869dceb71ad746d2d73b3a46ee: Status 404 returned error can't find the container with id 08fe3abba2c1a4ce79d41e5baa29223cad9395869dceb71ad746d2d73b3a46ee Apr 17 11:25:08.991584 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:08.991557 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:09.028527 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.028471 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq"] Apr 17 11:25:09.033303 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:09.033261 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6082391e_80a3_4b55_b3b7_ab10578f3c56.slice/crio-79ad652d13dc138b6f0573a50ad3a8df9c266d25005bbe28e57b9a7aadc9ac42 WatchSource:0}: Error finding container 79ad652d13dc138b6f0573a50ad3a8df9c266d25005bbe28e57b9a7aadc9ac42: Status 404 returned error can't find the container with id 79ad652d13dc138b6f0573a50ad3a8df9c266d25005bbe28e57b9a7aadc9ac42 Apr 17 11:25:09.096074 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.096011 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:09.126850 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.126711 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f"] Apr 17 11:25:09.128933 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:09.128907 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88460bf3_b65e_42c3_b453_c9fe3bca6c55.slice/crio-4aee3cb4e461c33f9a9d38ee334bb79b9841a3f522deac0df8f0e9d3a1b419e0 WatchSource:0}: Error finding container 4aee3cb4e461c33f9a9d38ee334bb79b9841a3f522deac0df8f0e9d3a1b419e0: Status 404 returned error can't find the container with id 4aee3cb4e461c33f9a9d38ee334bb79b9841a3f522deac0df8f0e9d3a1b419e0 Apr 17 11:25:09.244985 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.244958 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp"] Apr 17 11:25:09.294274 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:09.294237 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5157fcbc_2523_4106_a368_351d1fd02dc5.slice/crio-898969fae38f30037f1fa1398f19783344a37238abcf2983461a8373e5f7774d WatchSource:0}: Error finding container 898969fae38f30037f1fa1398f19783344a37238abcf2983461a8373e5f7774d: Status 404 returned error can't find the container with id 898969fae38f30037f1fa1398f19783344a37238abcf2983461a8373e5f7774d Apr 17 11:25:09.729296 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.729259 2577 generic.go:358] "Generic (PLEG): container finished" podID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerID="2da5854f99a7c6660c8d228e77bbf50b165adaee74814df52c07599440a52720" exitCode=0 Apr 17 11:25:09.729716 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.729345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" event={"ID":"88460bf3-b65e-42c3-b453-c9fe3bca6c55","Type":"ContainerDied","Data":"2da5854f99a7c6660c8d228e77bbf50b165adaee74814df52c07599440a52720"} Apr 17 11:25:09.729716 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.729406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" event={"ID":"88460bf3-b65e-42c3-b453-c9fe3bca6c55","Type":"ContainerStarted","Data":"4aee3cb4e461c33f9a9d38ee334bb79b9841a3f522deac0df8f0e9d3a1b419e0"} Apr 17 11:25:09.730773 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.730721 2577 generic.go:358] "Generic (PLEG): container finished" podID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerID="178783d0953d29bfb82532c59279ac5b811fcf7f42d0d2320b3e7958f5d92264" exitCode=0 Apr 17 11:25:09.730842 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.730800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" event={"ID":"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be","Type":"ContainerDied","Data":"178783d0953d29bfb82532c59279ac5b811fcf7f42d0d2320b3e7958f5d92264"} Apr 17 11:25:09.730842 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.730824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" event={"ID":"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be","Type":"ContainerStarted","Data":"08fe3abba2c1a4ce79d41e5baa29223cad9395869dceb71ad746d2d73b3a46ee"} Apr 17 11:25:09.732266 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.732242 2577 generic.go:358] "Generic (PLEG): container finished" podID="5157fcbc-2523-4106-a368-351d1fd02dc5" 
containerID="067fa3ec4b8a9961fb54c2246844c69cc5112d22e704b530edae83a66a7ccb7d" exitCode=0 Apr 17 11:25:09.732341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.732323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" event={"ID":"5157fcbc-2523-4106-a368-351d1fd02dc5","Type":"ContainerDied","Data":"067fa3ec4b8a9961fb54c2246844c69cc5112d22e704b530edae83a66a7ccb7d"} Apr 17 11:25:09.732430 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.732385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" event={"ID":"5157fcbc-2523-4106-a368-351d1fd02dc5","Type":"ContainerStarted","Data":"898969fae38f30037f1fa1398f19783344a37238abcf2983461a8373e5f7774d"} Apr 17 11:25:09.734006 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.733983 2577 generic.go:358] "Generic (PLEG): container finished" podID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerID="fcef7487b274f21bc8a63d960910efa55133cd93b776ccdd3e784ba66df98e11" exitCode=0 Apr 17 11:25:09.734108 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.734061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" event={"ID":"6082391e-80a3-4b55-b3b7-ab10578f3c56","Type":"ContainerDied","Data":"fcef7487b274f21bc8a63d960910efa55133cd93b776ccdd3e784ba66df98e11"} Apr 17 11:25:09.734190 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:09.734173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" event={"ID":"6082391e-80a3-4b55-b3b7-ab10578f3c56","Type":"ContainerStarted","Data":"79ad652d13dc138b6f0573a50ad3a8df9c266d25005bbe28e57b9a7aadc9ac42"} Apr 17 11:25:10.739602 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:10.739568 2577 generic.go:358] "Generic 
(PLEG): container finished" podID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerID="08f879a1435826e890b3607141311ce8bc4a3ad04b2350b5b226bc9bfd2b2767" exitCode=0 Apr 17 11:25:10.739993 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:10.739651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" event={"ID":"5157fcbc-2523-4106-a368-351d1fd02dc5","Type":"ContainerDied","Data":"08f879a1435826e890b3607141311ce8bc4a3ad04b2350b5b226bc9bfd2b2767"} Apr 17 11:25:10.741259 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:10.741239 2577 generic.go:358] "Generic (PLEG): container finished" podID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerID="0e5d74c9b8fb6d28dc56b5fc54da64b855230f202327528a6aea907ab3a529f3" exitCode=0 Apr 17 11:25:10.741358 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:10.741304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" event={"ID":"6082391e-80a3-4b55-b3b7-ab10578f3c56","Type":"ContainerDied","Data":"0e5d74c9b8fb6d28dc56b5fc54da64b855230f202327528a6aea907ab3a529f3"} Apr 17 11:25:11.746832 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.746799 2577 generic.go:358] "Generic (PLEG): container finished" podID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerID="4de1ab1435546b3ca430c4481d18cd80556f3b515d60adc830ab0e3f8894eb6f" exitCode=0 Apr 17 11:25:11.747241 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.746851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" event={"ID":"88460bf3-b65e-42c3-b453-c9fe3bca6c55","Type":"ContainerDied","Data":"4de1ab1435546b3ca430c4481d18cd80556f3b515d60adc830ab0e3f8894eb6f"} Apr 17 11:25:11.748511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.748489 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerID="331d31a735b3b414ea73ab7499508b9c63b05238005b04a6c496e27f0f87059f" exitCode=0 Apr 17 11:25:11.748631 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.748520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" event={"ID":"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be","Type":"ContainerDied","Data":"331d31a735b3b414ea73ab7499508b9c63b05238005b04a6c496e27f0f87059f"} Apr 17 11:25:11.750455 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.750438 2577 generic.go:358] "Generic (PLEG): container finished" podID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerID="34c354289d77a129e8a62b09a9af97c64dc12b1ae0540d2cf75216b8563ea533" exitCode=0 Apr 17 11:25:11.750602 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.750507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" event={"ID":"5157fcbc-2523-4106-a368-351d1fd02dc5","Type":"ContainerDied","Data":"34c354289d77a129e8a62b09a9af97c64dc12b1ae0540d2cf75216b8563ea533"} Apr 17 11:25:11.752276 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.752217 2577 generic.go:358] "Generic (PLEG): container finished" podID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerID="f452592a568955eca6fcdc49ad97253b612e1d0fb0c5c41812e2a73d16af8aea" exitCode=0 Apr 17 11:25:11.752335 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:11.752297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" event={"ID":"6082391e-80a3-4b55-b3b7-ab10578f3c56","Type":"ContainerDied","Data":"f452592a568955eca6fcdc49ad97253b612e1d0fb0c5c41812e2a73d16af8aea"} Apr 17 11:25:12.757213 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.757178 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerID="5f86a884eee7ed2c15c7d42766be9124190522badcc67a1c66fa5342e6a31c0a" exitCode=0 Apr 17 11:25:12.757678 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.757249 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" event={"ID":"88460bf3-b65e-42c3-b453-c9fe3bca6c55","Type":"ContainerDied","Data":"5f86a884eee7ed2c15c7d42766be9124190522badcc67a1c66fa5342e6a31c0a"} Apr 17 11:25:12.759060 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.759040 2577 generic.go:358] "Generic (PLEG): container finished" podID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerID="454405a3b0bd740554997c040f4277cbebe8bb61baa3091db02d9b09e76783eb" exitCode=0 Apr 17 11:25:12.759203 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.759112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" event={"ID":"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be","Type":"ContainerDied","Data":"454405a3b0bd740554997c040f4277cbebe8bb61baa3091db02d9b09e76783eb"} Apr 17 11:25:12.903290 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.903268 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:12.906310 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.906290 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:12.960974 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.960943 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-bundle\") pod \"6082391e-80a3-4b55-b3b7-ab10578f3c56\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " Apr 17 11:25:12.961140 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.961040 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-bundle\") pod \"5157fcbc-2523-4106-a368-351d1fd02dc5\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " Apr 17 11:25:12.961140 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.961067 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpd6n\" (UniqueName: \"kubernetes.io/projected/5157fcbc-2523-4106-a368-351d1fd02dc5-kube-api-access-kpd6n\") pod \"5157fcbc-2523-4106-a368-351d1fd02dc5\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " Apr 17 11:25:12.961140 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.961100 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-util\") pod \"6082391e-80a3-4b55-b3b7-ab10578f3c56\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " Apr 17 11:25:12.961140 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.961133 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trsqt\" (UniqueName: \"kubernetes.io/projected/6082391e-80a3-4b55-b3b7-ab10578f3c56-kube-api-access-trsqt\") pod \"6082391e-80a3-4b55-b3b7-ab10578f3c56\" (UID: \"6082391e-80a3-4b55-b3b7-ab10578f3c56\") " Apr 17 11:25:12.961343 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:25:12.961155 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-util\") pod \"5157fcbc-2523-4106-a368-351d1fd02dc5\" (UID: \"5157fcbc-2523-4106-a368-351d1fd02dc5\") " Apr 17 11:25:12.961606 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.961575 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-bundle" (OuterVolumeSpecName: "bundle") pod "6082391e-80a3-4b55-b3b7-ab10578f3c56" (UID: "6082391e-80a3-4b55-b3b7-ab10578f3c56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:12.961733 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.961693 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-bundle" (OuterVolumeSpecName: "bundle") pod "5157fcbc-2523-4106-a368-351d1fd02dc5" (UID: "5157fcbc-2523-4106-a368-351d1fd02dc5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:12.963570 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.963547 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6082391e-80a3-4b55-b3b7-ab10578f3c56-kube-api-access-trsqt" (OuterVolumeSpecName: "kube-api-access-trsqt") pod "6082391e-80a3-4b55-b3b7-ab10578f3c56" (UID: "6082391e-80a3-4b55-b3b7-ab10578f3c56"). InnerVolumeSpecName "kube-api-access-trsqt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:12.963681 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.963573 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5157fcbc-2523-4106-a368-351d1fd02dc5-kube-api-access-kpd6n" (OuterVolumeSpecName: "kube-api-access-kpd6n") pod "5157fcbc-2523-4106-a368-351d1fd02dc5" (UID: "5157fcbc-2523-4106-a368-351d1fd02dc5"). InnerVolumeSpecName "kube-api-access-kpd6n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:12.966716 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.966690 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-util" (OuterVolumeSpecName: "util") pod "5157fcbc-2523-4106-a368-351d1fd02dc5" (UID: "5157fcbc-2523-4106-a368-351d1fd02dc5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:12.966885 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:12.966866 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-util" (OuterVolumeSpecName: "util") pod "6082391e-80a3-4b55-b3b7-ab10578f3c56" (UID: "6082391e-80a3-4b55-b3b7-ab10578f3c56"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:13.061914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.061830 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:13.061914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.061859 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpd6n\" (UniqueName: \"kubernetes.io/projected/5157fcbc-2523-4106-a368-351d1fd02dc5-kube-api-access-kpd6n\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:13.061914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.061871 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:13.061914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.061882 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trsqt\" (UniqueName: \"kubernetes.io/projected/6082391e-80a3-4b55-b3b7-ab10578f3c56-kube-api-access-trsqt\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:13.061914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.061891 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5157fcbc-2523-4106-a368-351d1fd02dc5-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:13.061914 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.061901 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6082391e-80a3-4b55-b3b7-ab10578f3c56-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:13.764142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.764106 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" event={"ID":"5157fcbc-2523-4106-a368-351d1fd02dc5","Type":"ContainerDied","Data":"898969fae38f30037f1fa1398f19783344a37238abcf2983461a8373e5f7774d"} Apr 17 11:25:13.764142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.764130 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30xdmsp" Apr 17 11:25:13.764648 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.764150 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898969fae38f30037f1fa1398f19783344a37238abcf2983461a8373e5f7774d" Apr 17 11:25:13.765804 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.765775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" event={"ID":"6082391e-80a3-4b55-b3b7-ab10578f3c56","Type":"ContainerDied","Data":"79ad652d13dc138b6f0573a50ad3a8df9c266d25005bbe28e57b9a7aadc9ac42"} Apr 17 11:25:13.765804 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.765805 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ad652d13dc138b6f0573a50ad3a8df9c266d25005bbe28e57b9a7aadc9ac42" Apr 17 11:25:13.765947 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.765860 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qllhq" Apr 17 11:25:13.905165 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.905134 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:13.928935 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.928836 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:13.970310 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.970278 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-bundle\") pod \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " Apr 17 11:25:13.970488 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.970339 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-util\") pod \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " Apr 17 11:25:13.970488 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.970404 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vf24\" (UniqueName: \"kubernetes.io/projected/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-kube-api-access-5vf24\") pod \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\" (UID: \"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be\") " Apr 17 11:25:13.970881 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.970859 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-bundle" (OuterVolumeSpecName: "bundle") pod "7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" (UID: "7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:13.972676 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.972654 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-kube-api-access-5vf24" (OuterVolumeSpecName: "kube-api-access-5vf24") pod "7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" (UID: "7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be"). InnerVolumeSpecName "kube-api-access-5vf24". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:13.975834 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:13.975795 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-util" (OuterVolumeSpecName: "util") pod "7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" (UID: "7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:14.071509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071444 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-util\") pod \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " Apr 17 11:25:14.071509 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071487 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-bundle\") pod \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " Apr 17 11:25:14.071698 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071530 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqfbc\" (UniqueName: \"kubernetes.io/projected/88460bf3-b65e-42c3-b453-c9fe3bca6c55-kube-api-access-gqfbc\") pod 
\"88460bf3-b65e-42c3-b453-c9fe3bca6c55\" (UID: \"88460bf3-b65e-42c3-b453-c9fe3bca6c55\") " Apr 17 11:25:14.071761 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071721 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:14.071761 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071739 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vf24\" (UniqueName: \"kubernetes.io/projected/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-kube-api-access-5vf24\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:14.071761 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071754 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:14.072021 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.071988 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-bundle" (OuterVolumeSpecName: "bundle") pod "88460bf3-b65e-42c3-b453-c9fe3bca6c55" (UID: "88460bf3-b65e-42c3-b453-c9fe3bca6c55"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:14.073960 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.073934 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88460bf3-b65e-42c3-b453-c9fe3bca6c55-kube-api-access-gqfbc" (OuterVolumeSpecName: "kube-api-access-gqfbc") pod "88460bf3-b65e-42c3-b453-c9fe3bca6c55" (UID: "88460bf3-b65e-42c3-b453-c9fe3bca6c55"). InnerVolumeSpecName "kube-api-access-gqfbc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:14.076902 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.076878 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-util" (OuterVolumeSpecName: "util") pod "88460bf3-b65e-42c3-b453-c9fe3bca6c55" (UID: "88460bf3-b65e-42c3-b453-c9fe3bca6c55"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:25:14.172414 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.172350 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-util\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:14.172414 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.172412 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88460bf3-b65e-42c3-b453-c9fe3bca6c55-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:14.172414 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.172422 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqfbc\" (UniqueName: \"kubernetes.io/projected/88460bf3-b65e-42c3-b453-c9fe3bca6c55-kube-api-access-gqfbc\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:14.771718 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.771681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" event={"ID":"88460bf3-b65e-42c3-b453-c9fe3bca6c55","Type":"ContainerDied","Data":"4aee3cb4e461c33f9a9d38ee334bb79b9841a3f522deac0df8f0e9d3a1b419e0"} Apr 17 11:25:14.771718 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.771716 2577 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4aee3cb4e461c33f9a9d38ee334bb79b9841a3f522deac0df8f0e9d3a1b419e0" Apr 17 11:25:14.772178 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.771718 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bwzb8f" Apr 17 11:25:14.773472 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.773433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" event={"ID":"7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be","Type":"ContainerDied","Data":"08fe3abba2c1a4ce79d41e5baa29223cad9395869dceb71ad746d2d73b3a46ee"} Apr 17 11:25:14.773472 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.773458 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08fe3abba2c1a4ce79d41e5baa29223cad9395869dceb71ad746d2d73b3a46ee" Apr 17 11:25:14.773472 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:14.773470 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503z4tmv" Apr 17 11:25:21.556210 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.556173 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b8985c5dc-nxw2k"] Apr 17 11:25:21.557645 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557611 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerName="pull" Apr 17 11:25:21.557645 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557644 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerName="pull" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557655 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557662 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557671 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="pull" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557677 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="pull" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557685 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557690 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" 
containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557706 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557714 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557726 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557731 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557739 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557744 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="extract" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557749 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557754 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557762 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="util" Apr 17 11:25:21.557805 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557767 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557772 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="pull" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557777 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="pull" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557784 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557793 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="util" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557803 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="pull" Apr 17 11:25:21.557805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557809 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="pull" Apr 17 11:25:21.558572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557877 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6082391e-80a3-4b55-b3b7-ab10578f3c56" containerName="extract" Apr 17 11:25:21.558572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557887 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="88460bf3-b65e-42c3-b453-c9fe3bca6c55" containerName="extract" Apr 17 11:25:21.558572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557894 2577 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="5157fcbc-2523-4106-a368-351d1fd02dc5" containerName="extract" Apr 17 11:25:21.558572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.557899 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e0be0c7-46bd-4c9f-a1ca-5bf16d14d3be" containerName="extract" Apr 17 11:25:21.568900 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.568874 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b8985c5dc-nxw2k"] Apr 17 11:25:21.568998 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.568975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636382 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lttnv\" (UniqueName: \"kubernetes.io/projected/c68de20a-f162-46df-a719-461537d94ab4-kube-api-access-lttnv\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-console-config\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c68de20a-f162-46df-a719-461537d94ab4-console-serving-cert\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " 
pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c68de20a-f162-46df-a719-461537d94ab4-console-oauth-config\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636611 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636517 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-service-ca\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636611 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-trusted-ca-bundle\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.636672 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.636635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-oauth-serving-cert\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.737773 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.737734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-service-ca\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.737930 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.737781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-trusted-ca-bundle\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.737930 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.737904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-oauth-serving-cert\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738013 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.737959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lttnv\" (UniqueName: \"kubernetes.io/projected/c68de20a-f162-46df-a719-461537d94ab4-kube-api-access-lttnv\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738013 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.737994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-console-config\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738089 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.738040 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c68de20a-f162-46df-a719-461537d94ab4-console-serving-cert\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738089 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.738062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c68de20a-f162-46df-a719-461537d94ab4-console-oauth-config\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738499 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.738463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-service-ca\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738693 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.738666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-oauth-serving-cert\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738789 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.738708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-trusted-ca-bundle\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.738789 ip-10-0-128-205 kubenswrapper[2577]: I0417 
11:25:21.738718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c68de20a-f162-46df-a719-461537d94ab4-console-config\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.740671 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.740641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c68de20a-f162-46df-a719-461537d94ab4-console-oauth-config\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.740844 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.740825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c68de20a-f162-46df-a719-461537d94ab4-console-serving-cert\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.753874 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.753855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lttnv\" (UniqueName: \"kubernetes.io/projected/c68de20a-f162-46df-a719-461537d94ab4-kube-api-access-lttnv\") pod \"console-5b8985c5dc-nxw2k\" (UID: \"c68de20a-f162-46df-a719-461537d94ab4\") " pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:21.879634 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:21.879536 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:22.011329 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.011267 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b8985c5dc-nxw2k"] Apr 17 11:25:22.013583 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:22.013555 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68de20a_f162_46df_a719_461537d94ab4.slice/crio-c6751eaed899752545a4c784bf887c6021f8785e6dcc58c44cc558e0c71c0aed WatchSource:0}: Error finding container c6751eaed899752545a4c784bf887c6021f8785e6dcc58c44cc558e0c71c0aed: Status 404 returned error can't find the container with id c6751eaed899752545a4c784bf887c6021f8785e6dcc58c44cc558e0c71c0aed Apr 17 11:25:22.417799 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.417762 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v"] Apr 17 11:25:22.420867 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.420851 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:22.423522 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.423497 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 11:25:22.423808 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.423791 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-qjhbl\"" Apr 17 11:25:22.423808 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.423805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 11:25:22.432747 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.432726 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v"] Apr 17 11:25:22.544210 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.544171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdg7b\" (UniqueName: \"kubernetes.io/projected/0dff2bcc-dd83-4dfd-921a-c93d43c03722-kube-api-access-kdg7b\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rwb7v\" (UID: \"0dff2bcc-dd83-4dfd-921a-c93d43c03722\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:22.645447 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.645406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdg7b\" (UniqueName: \"kubernetes.io/projected/0dff2bcc-dd83-4dfd-921a-c93d43c03722-kube-api-access-kdg7b\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rwb7v\" (UID: \"0dff2bcc-dd83-4dfd-921a-c93d43c03722\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:22.653487 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:25:22.653462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdg7b\" (UniqueName: \"kubernetes.io/projected/0dff2bcc-dd83-4dfd-921a-c93d43c03722-kube-api-access-kdg7b\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rwb7v\" (UID: \"0dff2bcc-dd83-4dfd-921a-c93d43c03722\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:22.731221 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.731188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:22.803069 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.803037 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b8985c5dc-nxw2k" event={"ID":"c68de20a-f162-46df-a719-461537d94ab4","Type":"ContainerStarted","Data":"ca8d7602ef4bf04daca363d856f3c6afbae54b6db570859de09d73c0fc83e08b"} Apr 17 11:25:22.803230 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.803076 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b8985c5dc-nxw2k" event={"ID":"c68de20a-f162-46df-a719-461537d94ab4","Type":"ContainerStarted","Data":"c6751eaed899752545a4c784bf887c6021f8785e6dcc58c44cc558e0c71c0aed"} Apr 17 11:25:22.824100 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:22.824043 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b8985c5dc-nxw2k" podStartSLOduration=1.8240234279999998 podStartE2EDuration="1.824023428s" podCreationTimestamp="2026-04-17 11:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:25:22.822310005 +0000 UTC m=+551.360232437" watchObservedRunningTime="2026-04-17 11:25:22.824023428 +0000 UTC m=+551.361945661" Apr 17 11:25:22.860995 ip-10-0-128-205 kubenswrapper[2577]: 
I0417 11:25:22.860960 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v"] Apr 17 11:25:22.864142 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:22.864100 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dff2bcc_dd83_4dfd_921a_c93d43c03722.slice/crio-5a43ebbdaff8c7dca1625385d5ffe7db4c7875eacb5b0484cc8478bc7675f907 WatchSource:0}: Error finding container 5a43ebbdaff8c7dca1625385d5ffe7db4c7875eacb5b0484cc8478bc7675f907: Status 404 returned error can't find the container with id 5a43ebbdaff8c7dca1625385d5ffe7db4c7875eacb5b0484cc8478bc7675f907 Apr 17 11:25:23.808891 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:23.808856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" event={"ID":"0dff2bcc-dd83-4dfd-921a-c93d43c03722","Type":"ContainerStarted","Data":"5a43ebbdaff8c7dca1625385d5ffe7db4c7875eacb5b0484cc8478bc7675f907"} Apr 17 11:25:25.818532 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:25.818491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" event={"ID":"0dff2bcc-dd83-4dfd-921a-c93d43c03722","Type":"ContainerStarted","Data":"1848f23901d2d81ae9fac0e87c4714db306c248d320b485cc09c773117549f66"} Apr 17 11:25:25.818907 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:25.818618 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:25.848723 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:25.848677 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" podStartSLOduration=1.41964353 podStartE2EDuration="3.848664566s" 
podCreationTimestamp="2026-04-17 11:25:22 +0000 UTC" firstStartedPulling="2026-04-17 11:25:22.866602964 +0000 UTC m=+551.404525174" lastFinishedPulling="2026-04-17 11:25:25.295623999 +0000 UTC m=+553.833546210" observedRunningTime="2026-04-17 11:25:25.845291649 +0000 UTC m=+554.383213881" watchObservedRunningTime="2026-04-17 11:25:25.848664566 +0000 UTC m=+554.386586797" Apr 17 11:25:28.757884 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:28.757848 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6"] Apr 17 11:25:28.765891 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:28.765865 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:28.768437 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:28.768419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-zcj4x\"" Apr 17 11:25:28.782552 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:28.782526 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6"] Apr 17 11:25:28.902343 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:28.902309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw8g7\" (UniqueName: \"kubernetes.io/projected/9e4787cd-4dfd-4599-84a6-cf70483badf1-kube-api-access-zw8g7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-rdcm6\" (UID: \"9e4787cd-4dfd-4599-84a6-cf70483badf1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:28.902343 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:28.902348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/9e4787cd-4dfd-4599-84a6-cf70483badf1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-rdcm6\" (UID: \"9e4787cd-4dfd-4599-84a6-cf70483badf1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:29.003333 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.003296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw8g7\" (UniqueName: \"kubernetes.io/projected/9e4787cd-4dfd-4599-84a6-cf70483badf1-kube-api-access-zw8g7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-rdcm6\" (UID: \"9e4787cd-4dfd-4599-84a6-cf70483badf1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:29.003333 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.003333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e4787cd-4dfd-4599-84a6-cf70483badf1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-rdcm6\" (UID: \"9e4787cd-4dfd-4599-84a6-cf70483badf1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:29.003690 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.003656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e4787cd-4dfd-4599-84a6-cf70483badf1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-rdcm6\" (UID: \"9e4787cd-4dfd-4599-84a6-cf70483badf1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:29.011437 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.011356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw8g7\" (UniqueName: \"kubernetes.io/projected/9e4787cd-4dfd-4599-84a6-cf70483badf1-kube-api-access-zw8g7\") 
pod \"kuadrant-operator-controller-manager-6ddf9554fc-rdcm6\" (UID: \"9e4787cd-4dfd-4599-84a6-cf70483badf1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:29.075137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.075107 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:29.229623 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.229601 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6"] Apr 17 11:25:29.230422 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:25:29.230401 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4787cd_4dfd_4599_84a6_cf70483badf1.slice/crio-c40c232ca0e326914a9b7308f68b1af6b552568876453ac48d4bef09115274f9 WatchSource:0}: Error finding container c40c232ca0e326914a9b7308f68b1af6b552568876453ac48d4bef09115274f9: Status 404 returned error can't find the container with id c40c232ca0e326914a9b7308f68b1af6b552568876453ac48d4bef09115274f9 Apr 17 11:25:29.834378 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:29.834329 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" event={"ID":"9e4787cd-4dfd-4599-84a6-cf70483badf1","Type":"ContainerStarted","Data":"c40c232ca0e326914a9b7308f68b1af6b552568876453ac48d4bef09115274f9"} Apr 17 11:25:31.880456 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:31.880403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:31.880932 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:31.880468 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:31.888775 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:31.888708 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:32.853017 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:32.852989 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b8985c5dc-nxw2k" Apr 17 11:25:32.902125 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:32.902091 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-594f5485b8-z4tx4"] Apr 17 11:25:33.853020 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:33.852979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" event={"ID":"9e4787cd-4dfd-4599-84a6-cf70483badf1","Type":"ContainerStarted","Data":"6e1891d1f2a2a16d358607c63cedca99b69652c7e22b0738bd52ac0ff4beec5e"} Apr 17 11:25:33.853205 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:33.853042 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:33.875216 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:33.875164 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" podStartSLOduration=1.346079896 podStartE2EDuration="5.875138439s" podCreationTimestamp="2026-04-17 11:25:28 +0000 UTC" firstStartedPulling="2026-04-17 11:25:29.233878017 +0000 UTC m=+557.771800240" lastFinishedPulling="2026-04-17 11:25:33.76293657 +0000 UTC m=+562.300858783" observedRunningTime="2026-04-17 11:25:33.872245374 +0000 UTC m=+562.410167616" watchObservedRunningTime="2026-04-17 11:25:33.875138439 +0000 UTC m=+562.413060672" Apr 17 11:25:36.825226 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:36.825188 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rwb7v" Apr 17 11:25:44.864816 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:44.864776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-rdcm6" Apr 17 11:25:57.928255 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:57.928195 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-594f5485b8-z4tx4" podUID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" containerName="console" containerID="cri-o://b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d" gracePeriod=15 Apr 17 11:25:58.035782 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.035737 2577 patch_prober.go:28] interesting pod/console-594f5485b8-z4tx4 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" start-of-body= Apr 17 11:25:58.035965 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.035815 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-594f5485b8-z4tx4" podUID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" containerName="console" probeResult="failure" output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" Apr 17 11:25:58.179279 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.179216 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-594f5485b8-z4tx4_3dd97aa1-6d9e-4f68-ad38-7aea6147d061/console/0.log" Apr 17 11:25:58.179401 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.179279 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-594f5485b8-z4tx4" Apr 17 11:25:58.247530 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247498 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-service-ca\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247530 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247532 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-oauth-config\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247719 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247550 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-config\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247719 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247616 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgn55\" (UniqueName: \"kubernetes.io/projected/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-kube-api-access-pgn55\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247719 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247664 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-trusted-ca-bundle\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247849 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:25:58.247726 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-oauth-serving-cert\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247849 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247772 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-serving-cert\") pod \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\" (UID: \"3dd97aa1-6d9e-4f68-ad38-7aea6147d061\") " Apr 17 11:25:58.247953 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.247876 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-service-ca" (OuterVolumeSpecName: "service-ca") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:25:58.248119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.248085 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-config" (OuterVolumeSpecName: "console-config") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:25:58.248119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.248096 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:25:58.248335 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.248149 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-service-ca\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.248335 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.248166 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:25:58.249777 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.249755 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:25:58.249890 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.249869 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:25:58.249890 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.249872 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-kube-api-access-pgn55" (OuterVolumeSpecName: "kube-api-access-pgn55") pod "3dd97aa1-6d9e-4f68-ad38-7aea6147d061" (UID: "3dd97aa1-6d9e-4f68-ad38-7aea6147d061"). InnerVolumeSpecName "kube-api-access-pgn55". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:58.348729 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.348682 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgn55\" (UniqueName: \"kubernetes.io/projected/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-kube-api-access-pgn55\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.348729 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.348719 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-trusted-ca-bundle\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.348729 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.348728 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-oauth-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.348729 
ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.348738 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-serving-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.349031 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.348747 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-oauth-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.349031 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.348756 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3dd97aa1-6d9e-4f68-ad38-7aea6147d061-console-config\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:25:58.948973 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.948941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-594f5485b8-z4tx4_3dd97aa1-6d9e-4f68-ad38-7aea6147d061/console/0.log" Apr 17 11:25:58.949423 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.948982 2577 generic.go:358] "Generic (PLEG): container finished" podID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" containerID="b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d" exitCode=2 Apr 17 11:25:58.949423 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.949014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-594f5485b8-z4tx4" event={"ID":"3dd97aa1-6d9e-4f68-ad38-7aea6147d061","Type":"ContainerDied","Data":"b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d"} Apr 17 11:25:58.949423 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.949050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-594f5485b8-z4tx4" 
event={"ID":"3dd97aa1-6d9e-4f68-ad38-7aea6147d061","Type":"ContainerDied","Data":"14b9394be3ea16e3e3afa60ba91c661ff3ddbfd84489fd84383c5a7884303488"} Apr 17 11:25:58.949423 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.949058 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-594f5485b8-z4tx4" Apr 17 11:25:58.949423 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.949066 2577 scope.go:117] "RemoveContainer" containerID="b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d" Apr 17 11:25:58.957995 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.957979 2577 scope.go:117] "RemoveContainer" containerID="b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d" Apr 17 11:25:58.958235 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:25:58.958217 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d\": container with ID starting with b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d not found: ID does not exist" containerID="b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d" Apr 17 11:25:58.958284 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.958243 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d"} err="failed to get container status \"b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d\": rpc error: code = NotFound desc = could not find container \"b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d\": container with ID starting with b3d4c5af590147e4a900dc5aacfb21fe8dd6995f40af651687578fabe90ee83d not found: ID does not exist" Apr 17 11:25:58.969918 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.969892 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-594f5485b8-z4tx4"] Apr 17 11:25:58.974218 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:25:58.974195 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-594f5485b8-z4tx4"] Apr 17 11:26:00.066157 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:00.066123 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" path="/var/lib/kubelet/pods/3dd97aa1-6d9e-4f68-ad38-7aea6147d061/volumes" Apr 17 11:26:11.949859 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:11.949835 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log" Apr 17 11:26:11.957505 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:11.957279 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log" Apr 17 11:26:18.883645 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.883609 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-88szw"] Apr 17 11:26:18.884164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.883906 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" containerName="console" Apr 17 11:26:18.884164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.883918 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" containerName="console" Apr 17 11:26:18.884164 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.884020 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dd97aa1-6d9e-4f68-ad38-7aea6147d061" containerName="console" Apr 17 11:26:18.890379 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.890343 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:18.892796 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.892773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 11:26:18.893166 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.892773 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-lcs5v\"" Apr 17 11:26:18.894615 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.894593 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-88szw"] Apr 17 11:26:18.920249 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:18.920210 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-88szw"] Apr 17 11:26:19.023475 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.023436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/717c4b78-672e-4ce2-bf9c-92108ff3a520-config-file\") pod \"limitador-limitador-67566c68b4-88szw\" (UID: \"717c4b78-672e-4ce2-bf9c-92108ff3a520\") " pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.023649 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.023516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg7j7\" (UniqueName: \"kubernetes.io/projected/717c4b78-672e-4ce2-bf9c-92108ff3a520-kube-api-access-sg7j7\") pod \"limitador-limitador-67566c68b4-88szw\" (UID: \"717c4b78-672e-4ce2-bf9c-92108ff3a520\") " pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.124138 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.124101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sg7j7\" (UniqueName: \"kubernetes.io/projected/717c4b78-672e-4ce2-bf9c-92108ff3a520-kube-api-access-sg7j7\") pod \"limitador-limitador-67566c68b4-88szw\" (UID: \"717c4b78-672e-4ce2-bf9c-92108ff3a520\") " pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.124311 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.124227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/717c4b78-672e-4ce2-bf9c-92108ff3a520-config-file\") pod \"limitador-limitador-67566c68b4-88szw\" (UID: \"717c4b78-672e-4ce2-bf9c-92108ff3a520\") " pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.124878 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.124858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/717c4b78-672e-4ce2-bf9c-92108ff3a520-config-file\") pod \"limitador-limitador-67566c68b4-88szw\" (UID: \"717c4b78-672e-4ce2-bf9c-92108ff3a520\") " pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.132567 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.132539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg7j7\" (UniqueName: \"kubernetes.io/projected/717c4b78-672e-4ce2-bf9c-92108ff3a520-kube-api-access-sg7j7\") pod \"limitador-limitador-67566c68b4-88szw\" (UID: \"717c4b78-672e-4ce2-bf9c-92108ff3a520\") " pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.202528 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.202494 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:19.331549 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:19.331522 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-88szw"] Apr 17 11:26:19.333207 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:26:19.333179 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod717c4b78_672e_4ce2_bf9c_92108ff3a520.slice/crio-a3cf658f4719fc63e556af2ac0df4c01e62a36b1215559b797efcb8ab9b16e9a WatchSource:0}: Error finding container a3cf658f4719fc63e556af2ac0df4c01e62a36b1215559b797efcb8ab9b16e9a: Status 404 returned error can't find the container with id a3cf658f4719fc63e556af2ac0df4c01e62a36b1215559b797efcb8ab9b16e9a Apr 17 11:26:20.025924 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:20.025892 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" event={"ID":"717c4b78-672e-4ce2-bf9c-92108ff3a520","Type":"ContainerStarted","Data":"a3cf658f4719fc63e556af2ac0df4c01e62a36b1215559b797efcb8ab9b16e9a"} Apr 17 11:26:24.046783 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:24.046742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" event={"ID":"717c4b78-672e-4ce2-bf9c-92108ff3a520","Type":"ContainerStarted","Data":"8b7ce4af4d9eb00732d1fe4821fe87ff60f57749b34b24c7dc7fefd76dad01e8"} Apr 17 11:26:24.047196 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:24.046865 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:24.064023 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:24.063972 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" podStartSLOduration=2.358573988 
podStartE2EDuration="6.063958002s" podCreationTimestamp="2026-04-17 11:26:18 +0000 UTC" firstStartedPulling="2026-04-17 11:26:19.335016641 +0000 UTC m=+607.872938851" lastFinishedPulling="2026-04-17 11:26:23.040400654 +0000 UTC m=+611.578322865" observedRunningTime="2026-04-17 11:26:24.062472337 +0000 UTC m=+612.600394571" watchObservedRunningTime="2026-04-17 11:26:24.063958002 +0000 UTC m=+612.601880234" Apr 17 11:26:35.053580 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:35.053503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-88szw" Apr 17 11:26:58.948678 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:58.948639 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"] Apr 17 11:26:58.949206 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:58.948878 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" podUID="18467dd6-5389-4798-afd0-aca868524109" containerName="discovery" containerID="cri-o://feee5208d24d45d8fbf810d30dd678e1d534894f0d918c6f202afe7c8e619fff" gracePeriod=30 Apr 17 11:26:59.176341 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.176308 2577 generic.go:358] "Generic (PLEG): container finished" podID="18467dd6-5389-4798-afd0-aca868524109" containerID="feee5208d24d45d8fbf810d30dd678e1d534894f0d918c6f202afe7c8e619fff" exitCode=0 Apr 17 11:26:59.176531 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.176378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" event={"ID":"18467dd6-5389-4798-afd0-aca868524109","Type":"ContainerDied","Data":"feee5208d24d45d8fbf810d30dd678e1d534894f0d918c6f202afe7c8e619fff"} Apr 17 11:26:59.204960 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.204896 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:26:59.274141 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274105 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-csr-dns-cert\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 11:26:59.274141 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274147 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-istio-token\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 11:26:59.274427 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274187 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18467dd6-5389-4798-afd0-aca868524109-local-certs\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 11:26:59.274427 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274213 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18467dd6-5389-4798-afd0-aca868524109-istio-csr-ca-configmap\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 11:26:59.274427 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274290 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqr77\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-kube-api-access-hqr77\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 
11:26:59.274427 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274329 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-kubeconfig\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 11:26:59.274427 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274408 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-cacerts\") pod \"18467dd6-5389-4798-afd0-aca868524109\" (UID: \"18467dd6-5389-4798-afd0-aca868524109\") " Apr 17 11:26:59.274732 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.274696 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18467dd6-5389-4798-afd0-aca868524109-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:26:59.276901 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.276860 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:26:59.277099 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.276945 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18467dd6-5389-4798-afd0-aca868524109-local-certs" (OuterVolumeSpecName: "local-certs") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:26:59.277323 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.277291 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:26:59.277323 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.277308 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-cacerts" (OuterVolumeSpecName: "cacerts") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:26:59.277501 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.277482 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-istio-token" (OuterVolumeSpecName: "istio-token") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:59.277540 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.277504 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-kube-api-access-hqr77" (OuterVolumeSpecName: "kube-api-access-hqr77") pod "18467dd6-5389-4798-afd0-aca868524109" (UID: "18467dd6-5389-4798-afd0-aca868524109"). InnerVolumeSpecName "kube-api-access-hqr77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:59.375142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375095 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqr77\" (UniqueName: \"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-kube-api-access-hqr77\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:26:59.375142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375127 2577 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-kubeconfig\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:26:59.375142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375137 2577 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-cacerts\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:26:59.375142 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375145 2577 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/18467dd6-5389-4798-afd0-aca868524109-istio-csr-dns-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:26:59.375513 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375165 2577 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/18467dd6-5389-4798-afd0-aca868524109-istio-token\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:26:59.375513 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375174 2577 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/18467dd6-5389-4798-afd0-aca868524109-local-certs\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:26:59.375513 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:26:59.375182 2577 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/18467dd6-5389-4798-afd0-aca868524109-istio-csr-ca-configmap\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:27:00.181077 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:00.181042 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" Apr 17 11:27:00.181077 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:00.181061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb" event={"ID":"18467dd6-5389-4798-afd0-aca868524109","Type":"ContainerDied","Data":"c49af3cd419880d9c2f71cbca250e7a9497aac8d533661b0de3dac584a82a1f6"} Apr 17 11:27:00.181591 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:00.181108 2577 scope.go:117] "RemoveContainer" containerID="feee5208d24d45d8fbf810d30dd678e1d534894f0d918c6f202afe7c8e619fff" Apr 17 11:27:00.198805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:00.198780 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"] Apr 17 11:27:00.204842 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:00.204812 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qndwb"] Apr 17 11:27:02.066799 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:27:02.066766 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18467dd6-5389-4798-afd0-aca868524109" path="/var/lib/kubelet/pods/18467dd6-5389-4798-afd0-aca868524109/volumes" Apr 17 11:27:03.525785 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.525748 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-r9hnx"] Apr 17 11:27:03.526174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.526109 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18467dd6-5389-4798-afd0-aca868524109" containerName="discovery" Apr 17 11:27:03.526174 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.526121 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="18467dd6-5389-4798-afd0-aca868524109" containerName="discovery" Apr 17 11:27:03.526242 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.526183 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="18467dd6-5389-4798-afd0-aca868524109" containerName="discovery" Apr 17 11:27:03.528962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.528945 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.533738 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.533718 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 11:27:03.535120 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.535095 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 11:27:03.535239 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.535144 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-mpps7\"" Apr 17 11:27:03.535336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.535287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 11:27:03.538517 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.538495 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-r9hnx"] Apr 17 11:27:03.546842 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.546821 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf"] Apr 17 11:27:03.550157 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.550138 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.552872 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.552853 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 11:27:03.552963 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.552892 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-9q4xk\"" Apr 17 11:27:03.575880 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.575848 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf"] Apr 17 11:27:03.611481 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.611451 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/973ef5f8-86db-4e45-8197-f0c91a379e10-cert\") pod \"kserve-controller-manager-7dcb9f9f85-r9hnx\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.611665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.611523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad4c06e7-e478-40cd-ac98-906086b21c59-cert\") pod \"llmisvc-controller-manager-5d56b85c4d-2gtxf\" (UID: \"ad4c06e7-e478-40cd-ac98-906086b21c59\") " pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.611665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.611555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9962t\" (UniqueName: \"kubernetes.io/projected/973ef5f8-86db-4e45-8197-f0c91a379e10-kube-api-access-9962t\") pod \"kserve-controller-manager-7dcb9f9f85-r9hnx\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " 
pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.611665 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.611630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llvj\" (UniqueName: \"kubernetes.io/projected/ad4c06e7-e478-40cd-ac98-906086b21c59-kube-api-access-8llvj\") pod \"llmisvc-controller-manager-5d56b85c4d-2gtxf\" (UID: \"ad4c06e7-e478-40cd-ac98-906086b21c59\") " pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.615131 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.615103 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-ntxzk"] Apr 17 11:27:03.618056 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.618042 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.620353 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.620329 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-zzzlx\"" Apr 17 11:27:03.620714 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.620700 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 11:27:03.638745 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.638719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ntxzk"] Apr 17 11:27:03.712005 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.711976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad4c06e7-e478-40cd-ac98-906086b21c59-cert\") pod \"llmisvc-controller-manager-5d56b85c4d-2gtxf\" (UID: \"ad4c06e7-e478-40cd-ac98-906086b21c59\") " pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.712167 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.712020 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9m8\" (UniqueName: \"kubernetes.io/projected/aecdfa6c-405a-4826-b85c-ed166499eb7e-kube-api-access-lf9m8\") pod \"seaweedfs-86cc847c5c-ntxzk\" (UID: \"aecdfa6c-405a-4826-b85c-ed166499eb7e\") " pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.712167 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.712044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9962t\" (UniqueName: \"kubernetes.io/projected/973ef5f8-86db-4e45-8197-f0c91a379e10-kube-api-access-9962t\") pod \"kserve-controller-manager-7dcb9f9f85-r9hnx\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.712167 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.712062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8llvj\" (UniqueName: \"kubernetes.io/projected/ad4c06e7-e478-40cd-ac98-906086b21c59-kube-api-access-8llvj\") pod \"llmisvc-controller-manager-5d56b85c4d-2gtxf\" (UID: \"ad4c06e7-e478-40cd-ac98-906086b21c59\") " pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.712167 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.712084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/973ef5f8-86db-4e45-8197-f0c91a379e10-cert\") pod \"kserve-controller-manager-7dcb9f9f85-r9hnx\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.712167 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.712114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/aecdfa6c-405a-4826-b85c-ed166499eb7e-data\") pod \"seaweedfs-86cc847c5c-ntxzk\" (UID: \"aecdfa6c-405a-4826-b85c-ed166499eb7e\") " 
pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.714657 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.714627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/973ef5f8-86db-4e45-8197-f0c91a379e10-cert\") pod \"kserve-controller-manager-7dcb9f9f85-r9hnx\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.714768 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.714627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad4c06e7-e478-40cd-ac98-906086b21c59-cert\") pod \"llmisvc-controller-manager-5d56b85c4d-2gtxf\" (UID: \"ad4c06e7-e478-40cd-ac98-906086b21c59\") " pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.720450 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.720431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llvj\" (UniqueName: \"kubernetes.io/projected/ad4c06e7-e478-40cd-ac98-906086b21c59-kube-api-access-8llvj\") pod \"llmisvc-controller-manager-5d56b85c4d-2gtxf\" (UID: \"ad4c06e7-e478-40cd-ac98-906086b21c59\") " pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.720538 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.720431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9962t\" (UniqueName: \"kubernetes.io/projected/973ef5f8-86db-4e45-8197-f0c91a379e10-kube-api-access-9962t\") pod \"kserve-controller-manager-7dcb9f9f85-r9hnx\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.812860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.812757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9m8\" (UniqueName: 
\"kubernetes.io/projected/aecdfa6c-405a-4826-b85c-ed166499eb7e-kube-api-access-lf9m8\") pod \"seaweedfs-86cc847c5c-ntxzk\" (UID: \"aecdfa6c-405a-4826-b85c-ed166499eb7e\") " pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.812860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.812846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/aecdfa6c-405a-4826-b85c-ed166499eb7e-data\") pod \"seaweedfs-86cc847c5c-ntxzk\" (UID: \"aecdfa6c-405a-4826-b85c-ed166499eb7e\") " pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.813225 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.813206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/aecdfa6c-405a-4826-b85c-ed166499eb7e-data\") pod \"seaweedfs-86cc847c5c-ntxzk\" (UID: \"aecdfa6c-405a-4826-b85c-ed166499eb7e\") " pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.821226 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.821195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9m8\" (UniqueName: \"kubernetes.io/projected/aecdfa6c-405a-4826-b85c-ed166499eb7e-kube-api-access-lf9m8\") pod \"seaweedfs-86cc847c5c-ntxzk\" (UID: \"aecdfa6c-405a-4826-b85c-ed166499eb7e\") " pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.841082 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.841050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:03.860855 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.860829 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:03.928943 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.927948 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:03.982865 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.982819 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-r9hnx"] Apr 17 11:27:03.986029 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:27:03.985982 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973ef5f8_86db_4e45_8197_f0c91a379e10.slice/crio-e046040cccefd17b9bc44af4aefcd4aea5c49ed63781750c6ab364f6d95fd19c WatchSource:0}: Error finding container e046040cccefd17b9bc44af4aefcd4aea5c49ed63781750c6ab364f6d95fd19c: Status 404 returned error can't find the container with id e046040cccefd17b9bc44af4aefcd4aea5c49ed63781750c6ab364f6d95fd19c Apr 17 11:27:03.989847 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:03.988764 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:27:04.015466 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:04.015440 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf"] Apr 17 11:27:04.018066 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:27:04.018001 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podad4c06e7_e478_40cd_ac98_906086b21c59.slice/crio-136efef4e2e804c0306576128ceb0ed5311b27049660578aa3fd2972eaa291e4 WatchSource:0}: Error finding container 136efef4e2e804c0306576128ceb0ed5311b27049660578aa3fd2972eaa291e4: Status 404 returned error can't find the container with id 136efef4e2e804c0306576128ceb0ed5311b27049660578aa3fd2972eaa291e4 Apr 17 11:27:04.067483 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:04.067461 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ntxzk"] Apr 17 11:27:04.068277 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:27:04.068255 2577 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaecdfa6c_405a_4826_b85c_ed166499eb7e.slice/crio-08c8eb87ccedbef24c48518a6cc4d6521d1c73c03ee6d2757bb0aa520f5d60db WatchSource:0}: Error finding container 08c8eb87ccedbef24c48518a6cc4d6521d1c73c03ee6d2757bb0aa520f5d60db: Status 404 returned error can't find the container with id 08c8eb87ccedbef24c48518a6cc4d6521d1c73c03ee6d2757bb0aa520f5d60db Apr 17 11:27:04.198443 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:04.198403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ntxzk" event={"ID":"aecdfa6c-405a-4826-b85c-ed166499eb7e","Type":"ContainerStarted","Data":"08c8eb87ccedbef24c48518a6cc4d6521d1c73c03ee6d2757bb0aa520f5d60db"} Apr 17 11:27:04.199491 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:04.199444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" event={"ID":"ad4c06e7-e478-40cd-ac98-906086b21c59","Type":"ContainerStarted","Data":"136efef4e2e804c0306576128ceb0ed5311b27049660578aa3fd2972eaa291e4"} Apr 17 11:27:04.200452 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:04.200427 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" event={"ID":"973ef5f8-86db-4e45-8197-f0c91a379e10","Type":"ContainerStarted","Data":"e046040cccefd17b9bc44af4aefcd4aea5c49ed63781750c6ab364f6d95fd19c"} Apr 17 11:27:09.223171 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.223112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" event={"ID":"ad4c06e7-e478-40cd-ac98-906086b21c59","Type":"ContainerStarted","Data":"0eb5f89af640a235508ac52aa2068b56d6fd16d13b1d0e281ba89148cfc67722"} Apr 17 11:27:09.223679 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.223226 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:09.224684 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.224653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" event={"ID":"973ef5f8-86db-4e45-8197-f0c91a379e10","Type":"ContainerStarted","Data":"4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb"} Apr 17 11:27:09.224825 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.224763 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:09.226091 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.226071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ntxzk" event={"ID":"aecdfa6c-405a-4826-b85c-ed166499eb7e","Type":"ContainerStarted","Data":"cfe9d35669b7741737cb2eb3e9eb01bee54221d6c6de0432d590708c301a6048"} Apr 17 11:27:09.226209 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.226187 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:09.243246 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.243189 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" podStartSLOduration=1.4838603510000001 podStartE2EDuration="6.243175557s" podCreationTimestamp="2026-04-17 11:27:03 +0000 UTC" firstStartedPulling="2026-04-17 11:27:04.019136407 +0000 UTC m=+652.557058637" lastFinishedPulling="2026-04-17 11:27:08.778451632 +0000 UTC m=+657.316373843" observedRunningTime="2026-04-17 11:27:09.242177858 +0000 UTC m=+657.780100092" watchObservedRunningTime="2026-04-17 11:27:09.243175557 +0000 UTC m=+657.781097790" Apr 17 11:27:09.259141 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.259094 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/seaweedfs-86cc847c5c-ntxzk" podStartSLOduration=1.490385198 podStartE2EDuration="6.259080763s" podCreationTimestamp="2026-04-17 11:27:03 +0000 UTC" firstStartedPulling="2026-04-17 11:27:04.069542222 +0000 UTC m=+652.607464432" lastFinishedPulling="2026-04-17 11:27:08.838237768 +0000 UTC m=+657.376159997" observedRunningTime="2026-04-17 11:27:09.25683058 +0000 UTC m=+657.794752814" watchObservedRunningTime="2026-04-17 11:27:09.259080763 +0000 UTC m=+657.797002995" Apr 17 11:27:09.272210 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:09.272158 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" podStartSLOduration=1.6378137050000001 podStartE2EDuration="6.272145718s" podCreationTimestamp="2026-04-17 11:27:03 +0000 UTC" firstStartedPulling="2026-04-17 11:27:03.988951028 +0000 UTC m=+652.526873239" lastFinishedPulling="2026-04-17 11:27:08.623283028 +0000 UTC m=+657.161205252" observedRunningTime="2026-04-17 11:27:09.270573585 +0000 UTC m=+657.808495829" watchObservedRunningTime="2026-04-17 11:27:09.272145718 +0000 UTC m=+657.810067950" Apr 17 11:27:15.233026 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:15.232996 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-ntxzk" Apr 17 11:27:40.232183 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:40.232151 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5d56b85c4d-2gtxf" Apr 17 11:27:40.235252 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:40.235212 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:41.549108 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.549075 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-r9hnx"] Apr 17 11:27:41.549533 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:27:41.549287 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" podUID="973ef5f8-86db-4e45-8197-f0c91a379e10" containerName="manager" containerID="cri-o://4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb" gracePeriod=10 Apr 17 11:27:41.574499 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.574468 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-t2jlx"] Apr 17 11:27:41.578382 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.578346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.583922 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.583893 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-t2jlx"] Apr 17 11:27:41.628594 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.628562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2-cert\") pod \"kserve-controller-manager-7dcb9f9f85-t2jlx\" (UID: \"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.628746 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.628685 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vcz\" (UniqueName: \"kubernetes.io/projected/4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2-kube-api-access-v4vcz\") pod \"kserve-controller-manager-7dcb9f9f85-t2jlx\" (UID: \"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.729650 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.729588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-v4vcz\" (UniqueName: \"kubernetes.io/projected/4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2-kube-api-access-v4vcz\") pod \"kserve-controller-manager-7dcb9f9f85-t2jlx\" (UID: \"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.729864 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.729716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2-cert\") pod \"kserve-controller-manager-7dcb9f9f85-t2jlx\" (UID: \"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.732429 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.732403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2-cert\") pod \"kserve-controller-manager-7dcb9f9f85-t2jlx\" (UID: \"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.738722 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.738684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vcz\" (UniqueName: \"kubernetes.io/projected/4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2-kube-api-access-v4vcz\") pod \"kserve-controller-manager-7dcb9f9f85-t2jlx\" (UID: \"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:41.794315 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.794290 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:41.830261 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.830179 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/973ef5f8-86db-4e45-8197-f0c91a379e10-cert\") pod \"973ef5f8-86db-4e45-8197-f0c91a379e10\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " Apr 17 11:27:41.830261 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.830235 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9962t\" (UniqueName: \"kubernetes.io/projected/973ef5f8-86db-4e45-8197-f0c91a379e10-kube-api-access-9962t\") pod \"973ef5f8-86db-4e45-8197-f0c91a379e10\" (UID: \"973ef5f8-86db-4e45-8197-f0c91a379e10\") " Apr 17 11:27:41.832788 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.832751 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973ef5f8-86db-4e45-8197-f0c91a379e10-kube-api-access-9962t" (OuterVolumeSpecName: "kube-api-access-9962t") pod "973ef5f8-86db-4e45-8197-f0c91a379e10" (UID: "973ef5f8-86db-4e45-8197-f0c91a379e10"). InnerVolumeSpecName "kube-api-access-9962t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:27:41.832920 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.832815 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973ef5f8-86db-4e45-8197-f0c91a379e10-cert" (OuterVolumeSpecName: "cert") pod "973ef5f8-86db-4e45-8197-f0c91a379e10" (UID: "973ef5f8-86db-4e45-8197-f0c91a379e10"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:27:41.931609 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.931567 2577 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/973ef5f8-86db-4e45-8197-f0c91a379e10-cert\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:27:41.931609 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.931598 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9962t\" (UniqueName: \"kubernetes.io/projected/973ef5f8-86db-4e45-8197-f0c91a379e10-kube-api-access-9962t\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\"" Apr 17 11:27:41.946558 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:41.946521 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:42.075763 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.075732 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-t2jlx"] Apr 17 11:27:42.077173 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:27:42.077126 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb70e99_f4d7_4019_80c8_3fe81dd3e2e2.slice/crio-1372a5e6f776ab4e9b56468b2fbc9e4a659caf48550a9c8faa3ae13669919eb4 WatchSource:0}: Error finding container 1372a5e6f776ab4e9b56468b2fbc9e4a659caf48550a9c8faa3ae13669919eb4: Status 404 returned error can't find the container with id 1372a5e6f776ab4e9b56468b2fbc9e4a659caf48550a9c8faa3ae13669919eb4 Apr 17 11:27:42.342196 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.342105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" event={"ID":"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2","Type":"ContainerStarted","Data":"1372a5e6f776ab4e9b56468b2fbc9e4a659caf48550a9c8faa3ae13669919eb4"} Apr 17 
11:27:42.343243 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.343215 2577 generic.go:358] "Generic (PLEG): container finished" podID="973ef5f8-86db-4e45-8197-f0c91a379e10" containerID="4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb" exitCode=0 Apr 17 11:27:42.343348 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.343272 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" event={"ID":"973ef5f8-86db-4e45-8197-f0c91a379e10","Type":"ContainerDied","Data":"4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb"} Apr 17 11:27:42.343348 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.343298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" event={"ID":"973ef5f8-86db-4e45-8197-f0c91a379e10","Type":"ContainerDied","Data":"e046040cccefd17b9bc44af4aefcd4aea5c49ed63781750c6ab364f6d95fd19c"} Apr 17 11:27:42.343348 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.343301 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-r9hnx" Apr 17 11:27:42.343348 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.343315 2577 scope.go:117] "RemoveContainer" containerID="4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb" Apr 17 11:27:42.352005 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.351987 2577 scope.go:117] "RemoveContainer" containerID="4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb" Apr 17 11:27:42.352310 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:27:42.352287 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb\": container with ID starting with 4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb not found: ID does not exist" containerID="4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb" Apr 17 11:27:42.352431 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.352320 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb"} err="failed to get container status \"4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb\": rpc error: code = NotFound desc = could not find container \"4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb\": container with ID starting with 4971caff677b14f1e8d1b27dd99f4ed86bd7b6c561d2f40eb1a710b5e68ef9fb not found: ID does not exist" Apr 17 11:27:42.359817 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.359787 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-r9hnx"] Apr 17 11:27:42.363052 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:42.363028 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-r9hnx"] Apr 17 11:27:43.349567 ip-10-0-128-205 
kubenswrapper[2577]: I0417 11:27:43.349531 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" event={"ID":"4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2","Type":"ContainerStarted","Data":"0f3119db6b08cf830a45affecebdfcd65fe17c91ba61d9c1f190a868a4102441"} Apr 17 11:27:43.349942 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:43.349667 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:27:43.365395 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:43.365313 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" podStartSLOduration=1.892455088 podStartE2EDuration="2.365295399s" podCreationTimestamp="2026-04-17 11:27:41 +0000 UTC" firstStartedPulling="2026-04-17 11:27:42.078468162 +0000 UTC m=+690.616390372" lastFinishedPulling="2026-04-17 11:27:42.551308471 +0000 UTC m=+691.089230683" observedRunningTime="2026-04-17 11:27:43.363405049 +0000 UTC m=+691.901327285" watchObservedRunningTime="2026-04-17 11:27:43.365295399 +0000 UTC m=+691.903217632" Apr 17 11:27:44.066632 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:27:44.066596 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973ef5f8-86db-4e45-8197-f0c91a379e10" path="/var/lib/kubelet/pods/973ef5f8-86db-4e45-8197-f0c91a379e10/volumes" Apr 17 11:28:14.358858 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:14.358826 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-t2jlx" Apr 17 11:28:15.222596 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.222551 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-xnwjm"] Apr 17 11:28:15.223003 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.222983 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="973ef5f8-86db-4e45-8197-f0c91a379e10" containerName="manager" Apr 17 11:28:15.223095 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.223004 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="973ef5f8-86db-4e45-8197-f0c91a379e10" containerName="manager" Apr 17 11:28:15.223095 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.223081 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="973ef5f8-86db-4e45-8197-f0c91a379e10" containerName="manager" Apr 17 11:28:15.227324 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.227298 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.229222 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.229184 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 11:28:15.229346 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.229275 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-pnkxh\"" Apr 17 11:28:15.237762 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.237738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xnwjm"] Apr 17 11:28:15.241999 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.241977 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-c7tvm"] Apr 17 11:28:15.245183 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.245168 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.250305 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.250281 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 11:28:15.250453 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.250398 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-9sh5w\"" Apr 17 11:28:15.258320 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.258295 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c7tvm"] Apr 17 11:28:15.311902 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.311865 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmmj\" (UniqueName: \"kubernetes.io/projected/8a987357-70f3-403f-a84b-2253cae492f0-kube-api-access-wpmmj\") pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.311902 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.311903 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc2q\" (UniqueName: \"kubernetes.io/projected/82a4489c-26f5-49f3-a824-7830c58c2865-kube-api-access-wzc2q\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.312119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.311949 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a987357-70f3-403f-a84b-2253cae492f0-cert\") pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " 
pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.312119 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.312014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82a4489c-26f5-49f3-a824-7830c58c2865-tls-certs\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.412799 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.412748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82a4489c-26f5-49f3-a824-7830c58c2865-tls-certs\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.412830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmmj\" (UniqueName: \"kubernetes.io/projected/8a987357-70f3-403f-a84b-2253cae492f0-kube-api-access-wpmmj\") pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.412849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc2q\" (UniqueName: \"kubernetes.io/projected/82a4489c-26f5-49f3-a824-7830c58c2865-kube-api-access-wzc2q\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.412883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a987357-70f3-403f-a84b-2253cae492f0-cert\") 
pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:28:15.412903 2577 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:28:15.412977 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82a4489c-26f5-49f3-a824-7830c58c2865-tls-certs podName:82a4489c-26f5-49f3-a824-7830c58c2865 nodeName:}" failed. No retries permitted until 2026-04-17 11:28:15.91296012 +0000 UTC m=+724.450882330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/82a4489c-26f5-49f3-a824-7830c58c2865-tls-certs") pod "model-serving-api-86f7b4b499-xnwjm" (UID: "82a4489c-26f5-49f3-a824-7830c58c2865") : secret "model-serving-api-tls" not found Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:28:15.412982 2577 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 11:28:15.413201 ip-10-0-128-205 kubenswrapper[2577]: E0417 11:28:15.413023 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a987357-70f3-403f-a84b-2253cae492f0-cert podName:8a987357-70f3-403f-a84b-2253cae492f0 nodeName:}" failed. No retries permitted until 2026-04-17 11:28:15.913008806 +0000 UTC m=+724.450931016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a987357-70f3-403f-a84b-2253cae492f0-cert") pod "odh-model-controller-696fc77849-c7tvm" (UID: "8a987357-70f3-403f-a84b-2253cae492f0") : secret "odh-model-controller-webhook-cert" not found Apr 17 11:28:15.425447 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.425419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmmj\" (UniqueName: \"kubernetes.io/projected/8a987357-70f3-403f-a84b-2253cae492f0-kube-api-access-wpmmj\") pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.428464 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.428437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc2q\" (UniqueName: \"kubernetes.io/projected/82a4489c-26f5-49f3-a824-7830c58c2865-kube-api-access-wzc2q\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.917212 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.917167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82a4489c-26f5-49f3-a824-7830c58c2865-tls-certs\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.917445 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.917280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a987357-70f3-403f-a84b-2253cae492f0-cert\") pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:15.919894 ip-10-0-128-205 kubenswrapper[2577]: I0417 
11:28:15.919867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/82a4489c-26f5-49f3-a824-7830c58c2865-tls-certs\") pod \"model-serving-api-86f7b4b499-xnwjm\" (UID: \"82a4489c-26f5-49f3-a824-7830c58c2865\") " pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:15.919894 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:15.919890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a987357-70f3-403f-a84b-2253cae492f0-cert\") pod \"odh-model-controller-696fc77849-c7tvm\" (UID: \"8a987357-70f3-403f-a84b-2253cae492f0\") " pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:16.138624 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:16.138593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:16.156110 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:16.156082 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:16.274805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:16.274758 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xnwjm"] Apr 17 11:28:16.277265 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:28:16.277233 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82a4489c_26f5_49f3_a824_7830c58c2865.slice/crio-3b8245dacac2582eb663e1d4a7df62a3c951be778e476bc116ba6aeaa460b668 WatchSource:0}: Error finding container 3b8245dacac2582eb663e1d4a7df62a3c951be778e476bc116ba6aeaa460b668: Status 404 returned error can't find the container with id 3b8245dacac2582eb663e1d4a7df62a3c951be778e476bc116ba6aeaa460b668 Apr 17 11:28:16.305565 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:16.305541 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c7tvm"] Apr 17 11:28:16.307979 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:28:16.307953 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a987357_70f3_403f_a84b_2253cae492f0.slice/crio-d1df85f0653ace9dc037fa09d387f0356fb5a83aa5fad91afdb82192b25f0692 WatchSource:0}: Error finding container d1df85f0653ace9dc037fa09d387f0356fb5a83aa5fad91afdb82192b25f0692: Status 404 returned error can't find the container with id d1df85f0653ace9dc037fa09d387f0356fb5a83aa5fad91afdb82192b25f0692 Apr 17 11:28:16.477860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:16.477828 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c7tvm" event={"ID":"8a987357-70f3-403f-a84b-2253cae492f0","Type":"ContainerStarted","Data":"d1df85f0653ace9dc037fa09d387f0356fb5a83aa5fad91afdb82192b25f0692"} Apr 17 11:28:16.478898 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:16.478868 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xnwjm" event={"ID":"82a4489c-26f5-49f3-a824-7830c58c2865","Type":"ContainerStarted","Data":"3b8245dacac2582eb663e1d4a7df62a3c951be778e476bc116ba6aeaa460b668"} Apr 17 11:28:19.493128 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:19.493084 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c7tvm" event={"ID":"8a987357-70f3-403f-a84b-2253cae492f0","Type":"ContainerStarted","Data":"5b94c9b5abbc3aff1c99d672f5b76145b3f2041bc4b7a160a40ba30728a1267d"} Apr 17 11:28:19.493632 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:19.493310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-c7tvm" Apr 17 11:28:19.494613 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:19.494584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xnwjm" event={"ID":"82a4489c-26f5-49f3-a824-7830c58c2865","Type":"ContainerStarted","Data":"685f4537705f38b83e24b469bb34609a5c70d2208b99595a67ac41bea883875e"} Apr 17 11:28:19.494726 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:19.494675 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-xnwjm" Apr 17 11:28:19.510161 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:19.510114 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-c7tvm" podStartSLOduration=1.711884169 podStartE2EDuration="4.510099735s" podCreationTimestamp="2026-04-17 11:28:15 +0000 UTC" firstStartedPulling="2026-04-17 11:28:16.309254586 +0000 UTC m=+724.847176798" lastFinishedPulling="2026-04-17 11:28:19.107470151 +0000 UTC m=+727.645392364" observedRunningTime="2026-04-17 11:28:19.507837159 +0000 UTC m=+728.045759393" watchObservedRunningTime="2026-04-17 11:28:19.510099735 +0000 UTC m=+728.048021967" Apr 17 
11:28:19.524860 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:19.524808 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-xnwjm" podStartSLOduration=1.75531973 podStartE2EDuration="4.524794154s" podCreationTimestamp="2026-04-17 11:28:15 +0000 UTC" firstStartedPulling="2026-04-17 11:28:16.279303152 +0000 UTC m=+724.817225366" lastFinishedPulling="2026-04-17 11:28:19.048777579 +0000 UTC m=+727.586699790" observedRunningTime="2026-04-17 11:28:19.522762552 +0000 UTC m=+728.060684786" watchObservedRunningTime="2026-04-17 11:28:19.524794154 +0000 UTC m=+728.062716385"
Apr 17 11:28:30.502081 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:30.502054 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-c7tvm"
Apr 17 11:28:30.503746 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:30.503727 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-xnwjm"
Apr 17 11:28:31.276079 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.276047 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-8qd6s"]
Apr 17 11:28:31.279429 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.279409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8qd6s"
Apr 17 11:28:31.284900 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.284870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-8qd6s"]
Apr 17 11:28:31.467629 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.467588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2lp\" (UniqueName: \"kubernetes.io/projected/6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2-kube-api-access-tf2lp\") pod \"s3-init-8qd6s\" (UID: \"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2\") " pod="kserve/s3-init-8qd6s"
Apr 17 11:28:31.568327 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.568241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2lp\" (UniqueName: \"kubernetes.io/projected/6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2-kube-api-access-tf2lp\") pod \"s3-init-8qd6s\" (UID: \"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2\") " pod="kserve/s3-init-8qd6s"
Apr 17 11:28:31.576778 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.576750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2lp\" (UniqueName: \"kubernetes.io/projected/6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2-kube-api-access-tf2lp\") pod \"s3-init-8qd6s\" (UID: \"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2\") " pod="kserve/s3-init-8qd6s"
Apr 17 11:28:31.589499 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.589475 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8qd6s"
Apr 17 11:28:31.738440 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:31.738414 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-8qd6s"]
Apr 17 11:28:31.740043 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:28:31.740014 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4ef5a8_b8e1_45e0_af03_73f1f04f7cf2.slice/crio-aae23707f2a514eecfab21c4719ea93140e3b554ae2bb8933776bc3fdcba0a17 WatchSource:0}: Error finding container aae23707f2a514eecfab21c4719ea93140e3b554ae2bb8933776bc3fdcba0a17: Status 404 returned error can't find the container with id aae23707f2a514eecfab21c4719ea93140e3b554ae2bb8933776bc3fdcba0a17
Apr 17 11:28:32.549416 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:32.549347 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8qd6s" event={"ID":"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2","Type":"ContainerStarted","Data":"aae23707f2a514eecfab21c4719ea93140e3b554ae2bb8933776bc3fdcba0a17"}
Apr 17 11:28:36.568599 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:36.568507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8qd6s" event={"ID":"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2","Type":"ContainerStarted","Data":"6d2404efe7f6aa28201406230bc53c0264ced1f5783c9e4448d982134efd084d"}
Apr 17 11:28:36.583785 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:36.583738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-8qd6s" podStartSLOduration=1.090534832 podStartE2EDuration="5.583721728s" podCreationTimestamp="2026-04-17 11:28:31 +0000 UTC" firstStartedPulling="2026-04-17 11:28:31.741712742 +0000 UTC m=+740.279634952" lastFinishedPulling="2026-04-17 11:28:36.234899638 +0000 UTC m=+744.772821848" observedRunningTime="2026-04-17 11:28:36.582429842 +0000 UTC m=+745.120352072" watchObservedRunningTime="2026-04-17 11:28:36.583721728 +0000 UTC m=+745.121643960"
Apr 17 11:28:39.580614 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:39.580528 2577 generic.go:358] "Generic (PLEG): container finished" podID="6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2" containerID="6d2404efe7f6aa28201406230bc53c0264ced1f5783c9e4448d982134efd084d" exitCode=0
Apr 17 11:28:39.580614 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:39.580601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8qd6s" event={"ID":"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2","Type":"ContainerDied","Data":"6d2404efe7f6aa28201406230bc53c0264ced1f5783c9e4448d982134efd084d"}
Apr 17 11:28:40.712999 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:40.712970 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8qd6s"
Apr 17 11:28:40.734079 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:40.734053 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf2lp\" (UniqueName: \"kubernetes.io/projected/6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2-kube-api-access-tf2lp\") pod \"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2\" (UID: \"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2\") "
Apr 17 11:28:40.736257 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:40.736227 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2-kube-api-access-tf2lp" (OuterVolumeSpecName: "kube-api-access-tf2lp") pod "6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2" (UID: "6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2"). InnerVolumeSpecName "kube-api-access-tf2lp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:28:40.835045 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:40.834959 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tf2lp\" (UniqueName: \"kubernetes.io/projected/6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2-kube-api-access-tf2lp\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:28:41.589011 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:41.588974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8qd6s" event={"ID":"6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2","Type":"ContainerDied","Data":"aae23707f2a514eecfab21c4719ea93140e3b554ae2bb8933776bc3fdcba0a17"}
Apr 17 11:28:41.589011 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:41.589011 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aae23707f2a514eecfab21c4719ea93140e3b554ae2bb8933776bc3fdcba0a17"
Apr 17 11:28:41.589011 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:41.588989 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8qd6s"
Apr 17 11:28:45.702997 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.702962 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k5n87/must-gather-bhrhp"]
Apr 17 11:28:45.703489 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.703330 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2" containerName="s3-init"
Apr 17 11:28:45.703489 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.703343 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2" containerName="s3-init"
Apr 17 11:28:45.703489 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.703442 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2" containerName="s3-init"
Apr 17 11:28:45.706688 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.706663 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:45.709254 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.709229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k5n87\"/\"kube-root-ca.crt\""
Apr 17 11:28:45.709380 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.709288 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k5n87\"/\"openshift-service-ca.crt\""
Apr 17 11:28:45.709616 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.709603 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k5n87\"/\"default-dockercfg-crjc6\""
Apr 17 11:28:45.714875 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.714849 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k5n87/must-gather-bhrhp"]
Apr 17 11:28:45.768768 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.768724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rf2\" (UniqueName: \"kubernetes.io/projected/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-kube-api-access-94rf2\") pod \"must-gather-bhrhp\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") " pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:45.768954 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.768828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-must-gather-output\") pod \"must-gather-bhrhp\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") " pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:45.869508 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.869470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94rf2\" (UniqueName: \"kubernetes.io/projected/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-kube-api-access-94rf2\") pod \"must-gather-bhrhp\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") " pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:45.869678 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.869557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-must-gather-output\") pod \"must-gather-bhrhp\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") " pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:45.869882 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.869865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-must-gather-output\") pod \"must-gather-bhrhp\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") " pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:45.877233 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:45.877208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rf2\" (UniqueName: \"kubernetes.io/projected/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-kube-api-access-94rf2\") pod \"must-gather-bhrhp\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") " pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:46.017075 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:46.016988 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:28:46.153790 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:46.153763 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k5n87/must-gather-bhrhp"]
Apr 17 11:28:46.156659 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:28:46.156617 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7b52e8_6ee2_47a3_b2e5_c92d1fd946d1.slice/crio-fdd0035d625c87893a343fb9fa336a17911a249d857c29b793d3ebc5b067a0af WatchSource:0}: Error finding container fdd0035d625c87893a343fb9fa336a17911a249d857c29b793d3ebc5b067a0af: Status 404 returned error can't find the container with id fdd0035d625c87893a343fb9fa336a17911a249d857c29b793d3ebc5b067a0af
Apr 17 11:28:46.609546 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:46.609514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k5n87/must-gather-bhrhp" event={"ID":"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1","Type":"ContainerStarted","Data":"fdd0035d625c87893a343fb9fa336a17911a249d857c29b793d3ebc5b067a0af"}
Apr 17 11:28:51.640709 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:51.640596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k5n87/must-gather-bhrhp" event={"ID":"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1","Type":"ContainerStarted","Data":"9b7f330d77d2985ca05e589cd0e0f068fadc8538cbe10b668d6f1169c2666346"}
Apr 17 11:28:51.641078 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:51.640718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k5n87/must-gather-bhrhp" event={"ID":"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1","Type":"ContainerStarted","Data":"3effe8b1ca433bcf775a80c8f57c2f9fc916e8295d75cb4f74b37f0ac93e09c6"}
Apr 17 11:28:51.657756 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:28:51.657683 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k5n87/must-gather-bhrhp" podStartSLOduration=2.05201454 podStartE2EDuration="6.65766414s" podCreationTimestamp="2026-04-17 11:28:45 +0000 UTC" firstStartedPulling="2026-04-17 11:28:46.158431082 +0000 UTC m=+754.696353292" lastFinishedPulling="2026-04-17 11:28:50.764080679 +0000 UTC m=+759.302002892" observedRunningTime="2026-04-17 11:28:51.654479382 +0000 UTC m=+760.192401649" watchObservedRunningTime="2026-04-17 11:28:51.65766414 +0000 UTC m=+760.195586375"
Apr 17 11:29:06.507625 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:06.507592 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq_77c235b0-17b5-4957-9e6c-91eb76b011d6/istio-proxy/0.log"
Apr 17 11:29:06.539642 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:06.539610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cbc658598-mr5gb_ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c/router/0.log"
Apr 17 11:29:07.271023 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:07.270991 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq_77c235b0-17b5-4957-9e6c-91eb76b011d6/istio-proxy/0.log"
Apr 17 11:29:07.294599 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:07.294575 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cbc658598-mr5gb_ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c/router/0.log"
Apr 17 11:29:08.370553 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:08.370520 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-rdcm6_9e4787cd-4dfd-4599-84a6-cf70483badf1/manager/0.log"
Apr 17 11:29:08.381069 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:08.381039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-88szw_717c4b78-672e-4ce2-bf9c-92108ff3a520/limitador/0.log"
Apr 17 11:29:08.396528 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:08.396497 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rwb7v_0dff2bcc-dd83-4dfd-921a-c93d43c03722/manager/0.log"
Apr 17 11:29:11.723455 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:11.723427 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerID="3effe8b1ca433bcf775a80c8f57c2f9fc916e8295d75cb4f74b37f0ac93e09c6" exitCode=0
Apr 17 11:29:11.723891 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:11.723479 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k5n87/must-gather-bhrhp" event={"ID":"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1","Type":"ContainerDied","Data":"3effe8b1ca433bcf775a80c8f57c2f9fc916e8295d75cb4f74b37f0ac93e09c6"}
Apr 17 11:29:11.723891 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:11.723822 2577 scope.go:117] "RemoveContainer" containerID="3effe8b1ca433bcf775a80c8f57c2f9fc916e8295d75cb4f74b37f0ac93e09c6"
Apr 17 11:29:12.150497 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:12.150401 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k5n87_must-gather-bhrhp_0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1/gather/0.log"
Apr 17 11:29:15.498058 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:15.498024 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-87nfm_3715a6cc-4533-4b7c-b268-85902d84afd1/global-pull-secret-syncer/0.log"
Apr 17 11:29:15.608912 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:15.608886 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pgpql_2fa87114-d537-4e80-b08a-605d0566022a/konnectivity-agent/0.log"
Apr 17 11:29:15.626626 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:15.626598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-205.ec2.internal_8a71fa9642bbb7713db79711084fe6ff/haproxy/0.log"
Apr 17 11:29:17.514436 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.514397 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k5n87/must-gather-bhrhp"]
Apr 17 11:29:17.514911 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.514607 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-k5n87/must-gather-bhrhp" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="copy" containerID="cri-o://9b7f330d77d2985ca05e589cd0e0f068fadc8538cbe10b668d6f1169c2666346" gracePeriod=2
Apr 17 11:29:17.517272 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.517238 2577 status_manager.go:895] "Failed to get status for pod" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" pod="openshift-must-gather-k5n87/must-gather-bhrhp" err="pods \"must-gather-bhrhp\" is forbidden: User \"system:node:ip-10-0-128-205.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-k5n87\": no relationship found between node 'ip-10-0-128-205.ec2.internal' and this object"
Apr 17 11:29:17.519324 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.518894 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k5n87/must-gather-bhrhp"]
Apr 17 11:29:17.747383 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.747342 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k5n87_must-gather-bhrhp_0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1/copy/0.log"
Apr 17 11:29:17.747687 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.747666 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerID="9b7f330d77d2985ca05e589cd0e0f068fadc8538cbe10b668d6f1169c2666346" exitCode=143
Apr 17 11:29:17.747760 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.747710 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd0035d625c87893a343fb9fa336a17911a249d857c29b793d3ebc5b067a0af"
Apr 17 11:29:17.755876 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.755858 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k5n87_must-gather-bhrhp_0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1/copy/0.log"
Apr 17 11:29:17.756182 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.756168 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:29:17.757892 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.757870 2577 status_manager.go:895] "Failed to get status for pod" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" pod="openshift-must-gather-k5n87/must-gather-bhrhp" err="pods \"must-gather-bhrhp\" is forbidden: User \"system:node:ip-10-0-128-205.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-k5n87\": no relationship found between node 'ip-10-0-128-205.ec2.internal' and this object"
Apr 17 11:29:17.861309 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.861213 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rf2\" (UniqueName: \"kubernetes.io/projected/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-kube-api-access-94rf2\") pod \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") "
Apr 17 11:29:17.861502 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.861323 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-must-gather-output\") pod \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\" (UID: \"0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1\") "
Apr 17 11:29:17.863656 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.863618 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-kube-api-access-94rf2" (OuterVolumeSpecName: "kube-api-access-94rf2") pod "0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" (UID: "0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1"). InnerVolumeSpecName "kube-api-access-94rf2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:29:17.866006 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.865984 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" (UID: "0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:29:17.961995 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.961956 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-must-gather-output\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:29:17.961995 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:17.961989 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94rf2\" (UniqueName: \"kubernetes.io/projected/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1-kube-api-access-94rf2\") on node \"ip-10-0-128-205.ec2.internal\" DevicePath \"\""
Apr 17 11:29:18.066948 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:18.066915 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" path="/var/lib/kubelet/pods/0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1/volumes"
Apr 17 11:29:18.750962 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:18.750935 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k5n87/must-gather-bhrhp"
Apr 17 11:29:19.633116 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:19.633090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-rdcm6_9e4787cd-4dfd-4599-84a6-cf70483badf1/manager/0.log"
Apr 17 11:29:19.668867 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:19.668842 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-88szw_717c4b78-672e-4ce2-bf9c-92108ff3a520/limitador/0.log"
Apr 17 11:29:19.698915 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:19.698876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rwb7v_0dff2bcc-dd83-4dfd-921a-c93d43c03722/manager/0.log"
Apr 17 11:29:21.173572 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:21.173546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k9kpt_7d8c1d69-d085-43d0-8ee2-384f6a278430/node-exporter/0.log"
Apr 17 11:29:21.193428 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:21.193402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k9kpt_7d8c1d69-d085-43d0-8ee2-384f6a278430/kube-rbac-proxy/0.log"
Apr 17 11:29:21.222273 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:21.222243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k9kpt_7d8c1d69-d085-43d0-8ee2-384f6a278430/init-textfile/0.log"
Apr 17 11:29:24.058805 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.058776 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b8985c5dc-nxw2k_c68de20a-f162-46df-a719-461537d94ab4/console/0.log"
Apr 17 11:29:24.749039 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.748994 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"]
Apr 17 11:29:24.749520 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.749490 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="gather"
Apr 17 11:29:24.749520 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.749513 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="gather"
Apr 17 11:29:24.749520 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.749526 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="copy"
Apr 17 11:29:24.749791 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.749532 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="copy"
Apr 17 11:29:24.749791 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.749588 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="copy"
Apr 17 11:29:24.749791 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.749601 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e7b52e8-6ee2-47a3-b2e5-c92d1fd946d1" containerName="gather"
Apr 17 11:29:24.754651 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.754635 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:24.757023 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.757002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2skc\"/\"kube-root-ca.crt\""
Apr 17 11:29:24.757146 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.757112 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2skc\"/\"openshift-service-ca.crt\""
Apr 17 11:29:24.757658 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.757640 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d2skc\"/\"default-dockercfg-6vh64\""
Apr 17 11:29:24.762250 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.762220 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"]
Apr 17 11:29:24.920270 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.920233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jr69\" (UniqueName: \"kubernetes.io/projected/868fbd41-befe-4c60-be63-128e092c9e2d-kube-api-access-6jr69\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:24.920270 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.920277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-proc\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:24.920541 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.920305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-sys\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:24.920541 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.920432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-lib-modules\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:24.920541 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:24.920488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-podres\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.021855 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-proc\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.021855 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-sys\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.021855 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-lib-modules\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.022111 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-podres\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.022111 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-proc\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.022111 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jr69\" (UniqueName: \"kubernetes.io/projected/868fbd41-befe-4c60-be63-128e092c9e2d-kube-api-access-6jr69\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.022111 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-sys\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.022111 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.021998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-podres\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.022111 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.022044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/868fbd41-befe-4c60-be63-128e092c9e2d-lib-modules\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.030483 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.030458 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jr69\" (UniqueName: \"kubernetes.io/projected/868fbd41-befe-4c60-be63-128e092c9e2d-kube-api-access-6jr69\") pod \"perf-node-gather-daemonset-hfx22\" (UID: \"868fbd41-befe-4c60-be63-128e092c9e2d\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.065331 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.065298 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.192296 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.192272 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"]
Apr 17 11:29:25.194525 ip-10-0-128-205 kubenswrapper[2577]: W0417 11:29:25.194495 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod868fbd41_befe_4c60_be63_128e092c9e2d.slice/crio-c76ecd8e55bf45cc86ccb7e7bb92584d68135f6ab37334cf6aa2e017eff05eb9 WatchSource:0}: Error finding container c76ecd8e55bf45cc86ccb7e7bb92584d68135f6ab37334cf6aa2e017eff05eb9: Status 404 returned error can't find the container with id c76ecd8e55bf45cc86ccb7e7bb92584d68135f6ab37334cf6aa2e017eff05eb9
Apr 17 11:29:25.324700 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.324598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-crscn_142cbae1-73ac-4077-9d7f-b3393da4de44/dns/0.log"
Apr 17 11:29:25.347495 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.347470 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-crscn_142cbae1-73ac-4077-9d7f-b3393da4de44/kube-rbac-proxy/0.log"
Apr 17 11:29:25.414815 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.414785 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5rdwf_440a3dee-fa32-4dd9-8f44-d5532ff12996/dns-node-resolver/0.log"
Apr 17 11:29:25.779426 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.779389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22" event={"ID":"868fbd41-befe-4c60-be63-128e092c9e2d","Type":"ContainerStarted","Data":"139711db13feb19b2e46a75ead6cc24fad9eaea8aea78245f8a1f2add4ad237c"}
Apr 17 11:29:25.779608 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.779436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22" event={"ID":"868fbd41-befe-4c60-be63-128e092c9e2d","Type":"ContainerStarted","Data":"c76ecd8e55bf45cc86ccb7e7bb92584d68135f6ab37334cf6aa2e017eff05eb9"}
Apr 17 11:29:25.779608 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.779467 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22"
Apr 17 11:29:25.796470 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.796421 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22" podStartSLOduration=1.796407564 podStartE2EDuration="1.796407564s" podCreationTimestamp="2026-04-17 11:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:29:25.794062894 +0000 UTC m=+794.331985138" watchObservedRunningTime="2026-04-17 11:29:25.796407564 +0000 UTC m=+794.334329793"
Apr 17 11:29:25.894792 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.894753 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-9d9f47c87-m5vrb_76638c14-f3f9-4a5a-88fc-a5cd6da627ed/registry/0.log"
Apr 17 11:29:25.917451 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:25.917418 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-btbxk_0a040950-ccaf-4d81-8e53-7c50e5eca541/node-ca/0.log"
Apr 17 11:29:26.759511 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:26.759478 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-lzxtq_77c235b0-17b5-4957-9e6c-91eb76b011d6/istio-proxy/0.log"
Apr 17 11:29:26.782534 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:26.782499 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-7cbc658598-mr5gb_ad9fdeae-eb5e-4ef6-8d92-555c9c355b3c/router/0.log" Apr 17 11:29:27.264253 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:27.264226 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kxgxj_9f51a76d-b37b-4ecc-8919-ea7f1f06e2cb/serve-healthcheck-canary/0.log" Apr 17 11:29:27.734336 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:27.734311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8h88w_54915c0c-e586-4854-9937-807160b46bb8/kube-rbac-proxy/0.log" Apr 17 11:29:27.755402 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:27.755374 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8h88w_54915c0c-e586-4854-9937-807160b46bb8/exporter/0.log" Apr 17 11:29:27.776686 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:27.776656 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8h88w_54915c0c-e586-4854-9937-807160b46bb8/extractor/0.log" Apr 17 11:29:30.974172 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:30.974096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7dcb9f9f85-t2jlx_4fb70e99-f4d7-4019-80c8-3fe81dd3e2e2/manager/0.log" Apr 17 11:29:30.999137 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:30.999107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5d56b85c4d-2gtxf_ad4c06e7-e478-40cd-ac98-906086b21c59/manager/0.log" Apr 17 11:29:31.020297 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:31.020274 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-xnwjm_82a4489c-26f5-49f3-a824-7830c58c2865/server/0.log" Apr 17 11:29:31.049646 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:31.049619 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-c7tvm_8a987357-70f3-403f-a84b-2253cae492f0/manager/0.log" Apr 17 11:29:31.066381 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:31.066342 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-8qd6s_6f4ef5a8-b8e1-45e0-af03-73f1f04f7cf2/s3-init/0.log" Apr 17 11:29:31.091865 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:31.091833 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-ntxzk_aecdfa6c-405a-4826-b85c-ed166499eb7e/seaweedfs/0.log" Apr 17 11:29:31.793542 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:31.793517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-hfx22" Apr 17 11:29:35.579821 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:35.579795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vgmt2_f882661d-584f-41d1-9758-7e68f8c80cc5/migrator/0.log" Apr 17 11:29:35.600078 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:35.600049 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vgmt2_f882661d-584f-41d1-9758-7e68f8c80cc5/graceful-termination/0.log" Apr 17 11:29:35.989950 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:35.989924 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6sx6h_241708e9-6c54-4758-aa09-fa52e406c967/kube-storage-version-migrator-operator/1.log" Apr 17 11:29:35.990748 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:35.990731 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6sx6h_241708e9-6c54-4758-aa09-fa52e406c967/kube-storage-version-migrator-operator/0.log" Apr 17 11:29:37.454711 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.454685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/kube-multus-additional-cni-plugins/0.log" Apr 17 11:29:37.475313 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.475289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/egress-router-binary-copy/0.log" Apr 17 11:29:37.494720 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.494695 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/cni-plugins/0.log" Apr 17 11:29:37.514905 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.514877 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/bond-cni-plugin/0.log" Apr 17 11:29:37.538001 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.537974 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/routeoverride-cni/0.log" Apr 17 11:29:37.557699 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.557672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/whereabouts-cni-bincopy/0.log" Apr 17 11:29:37.580730 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.580704 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ww4kd_43cab76d-7c1c-49f8-8a36-79896bc24bdc/whereabouts-cni/0.log" Apr 17 11:29:37.611903 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.611876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wzjb7_432289f7-2cea-4a47-8acc-2a378b04716a/kube-multus/0.log" Apr 17 11:29:37.674731 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.674696 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dn4mx_ecbf8c24-6e0b-4d26-9530-6bcc59825ca0/network-metrics-daemon/0.log" Apr 17 11:29:37.692598 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:37.692570 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dn4mx_ecbf8c24-6e0b-4d26-9530-6bcc59825ca0/kube-rbac-proxy/0.log" Apr 17 11:29:38.552610 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.552584 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-controller/0.log" Apr 17 11:29:38.570126 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.570098 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/0.log" Apr 17 11:29:38.574118 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.574096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovn-acl-logging/1.log" Apr 17 11:29:38.593043 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.593017 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/kube-rbac-proxy-node/0.log" Apr 17 11:29:38.614597 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.614575 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:29:38.634204 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.634171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/northd/0.log" Apr 17 11:29:38.653065 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.653033 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/nbdb/0.log" Apr 17 11:29:38.673658 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.673613 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/sbdb/0.log" Apr 17 11:29:38.777573 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:38.777538 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hzwl5_554f8dad-a601-4855-910a-f1e99d5cf979/ovnkube-controller/0.log" Apr 17 11:29:40.528157 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:40.528130 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-mqf7t_43381832-f484-4811-93b2-5d729f55a9c7/check-endpoints/0.log" Apr 17 11:29:40.580145 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:40.580115 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pb4l4_5848b99b-c76c-47de-b92e-288c830c8a96/network-check-target-container/0.log" Apr 17 11:29:41.619594 ip-10-0-128-205 kubenswrapper[2577]: I0417 11:29:41.619560 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qbctw_6421a751-16fb-48d9-b12a-268fc1c823b1/iptables-alerter/0.log"